Oct 04 04:46:13 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 04 04:46:13 crc restorecon[4573]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 04 04:46:13 crc restorecon[4573]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 
04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13
crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 
04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 04 04:46:13 crc restorecon[4573]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 
crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc 
restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:13 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc 
restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 04 04:46:14 crc restorecon[4573]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 04 04:46:14 crc kubenswrapper[4574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 04 04:46:14 crc kubenswrapper[4574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 04 04:46:14 crc kubenswrapper[4574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 04 04:46:14 crc kubenswrapper[4574]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 04 04:46:14 crc kubenswrapper[4574]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 04 04:46:14 crc kubenswrapper[4574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.508056 4574 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.510815 4574 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.510886 4574 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.510948 4574 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.510992 4574 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511034 4574 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511075 4574 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511121 4574 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511164 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511210 4574 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511278 4574 
feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511332 4574 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511378 4574 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511420 4574 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511462 4574 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511503 4574 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511545 4574 feature_gate.go:330] unrecognized feature gate: Example Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511586 4574 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511631 4574 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511674 4574 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511714 4574 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511755 4574 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511802 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511852 4574 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511895 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.511963 4574 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512028 4574 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512073 4574 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512114 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512154 4574 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512198 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512256 4574 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512307 4574 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512350 4574 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512396 4574 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512437 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512478 4574 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512519 4574 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 04:46:14 crc kubenswrapper[4574]: 
W1004 04:46:14.512564 4574 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512609 4574 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512655 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512696 4574 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512736 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512777 4574 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512821 4574 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512862 4574 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512906 4574 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512947 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.512986 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513027 4574 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513067 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513107 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513146 4574 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513191 4574 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513251 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513303 4574 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513345 4574 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513385 4574 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513427 4574 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513470 4574 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513517 4574 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513562 4574 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513605 4574 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513647 4574 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513692 4574 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513734 4574 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513776 4574 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513821 4574 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513862 4574 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513902 4574 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513947 4574 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.513989 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514100 4574 flags.go:64] FLAG: --address="0.0.0.0" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514152 4574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514277 4574 flags.go:64] FLAG: --anonymous-auth="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514338 4574 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514383 4574 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 
04:46:14.514427 4574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514472 4574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514520 4574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514565 4574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514617 4574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514662 4574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514705 4574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514753 4574 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514796 4574 flags.go:64] FLAG: --cgroup-root="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514842 4574 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514885 4574 flags.go:64] FLAG: --client-ca-file="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514932 4574 flags.go:64] FLAG: --cloud-config="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.514976 4574 flags.go:64] FLAG: --cloud-provider="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515019 4574 flags.go:64] FLAG: --cluster-dns="[]" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515068 4574 flags.go:64] FLAG: --cluster-domain="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515125 4574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515209 4574 flags.go:64] FLAG: --config-dir="" Oct 04 04:46:14 crc 
kubenswrapper[4574]: I1004 04:46:14.515285 4574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515341 4574 flags.go:64] FLAG: --container-log-max-files="5" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515389 4574 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515432 4574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515475 4574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515518 4574 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515561 4574 flags.go:64] FLAG: --contention-profiling="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515603 4574 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515655 4574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515703 4574 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515751 4574 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515799 4574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515846 4574 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515894 4574 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515936 4574 flags.go:64] FLAG: --enable-load-reader="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.515978 4574 flags.go:64] FLAG: --enable-server="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516032 
4574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516090 4574 flags.go:64] FLAG: --event-burst="100" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516144 4574 flags.go:64] FLAG: --event-qps="50" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516191 4574 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516248 4574 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516318 4574 flags.go:64] FLAG: --eviction-hard="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516382 4574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516459 4574 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516517 4574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516576 4574 flags.go:64] FLAG: --eviction-soft="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516628 4574 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516671 4574 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516721 4574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516789 4574 flags.go:64] FLAG: --experimental-mounter-path="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516835 4574 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.516878 4574 flags.go:64] FLAG: --fail-swap-on="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517194 4574 flags.go:64] FLAG: --feature-gates="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517268 4574 
flags.go:64] FLAG: --file-check-frequency="20s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517318 4574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517370 4574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517427 4574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517474 4574 flags.go:64] FLAG: --healthz-port="10248" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517527 4574 flags.go:64] FLAG: --help="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517575 4574 flags.go:64] FLAG: --hostname-override="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517618 4574 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517661 4574 flags.go:64] FLAG: --http-check-frequency="20s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517704 4574 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517745 4574 flags.go:64] FLAG: --image-credential-provider-config="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517787 4574 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517829 4574 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517876 4574 flags.go:64] FLAG: --image-service-endpoint="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517920 4574 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.517961 4574 flags.go:64] FLAG: --kube-api-burst="100" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518003 4574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 04 04:46:14 crc kubenswrapper[4574]: 
I1004 04:46:14.518052 4574 flags.go:64] FLAG: --kube-api-qps="50" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518108 4574 flags.go:64] FLAG: --kube-reserved="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518163 4574 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518222 4574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518382 4574 flags.go:64] FLAG: --kubelet-cgroups="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518440 4574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518510 4574 flags.go:64] FLAG: --lock-file="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518569 4574 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518626 4574 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518689 4574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518757 4574 flags.go:64] FLAG: --log-json-split-stream="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518818 4574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518875 4574 flags.go:64] FLAG: --log-text-split-stream="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518935 4574 flags.go:64] FLAG: --logging-format="text" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.518991 4574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519048 4574 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519095 4574 flags.go:64] FLAG: --manifest-url="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 
04:46:14.519138 4574 flags.go:64] FLAG: --manifest-url-header="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519188 4574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519247 4574 flags.go:64] FLAG: --max-open-files="1000000" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519296 4574 flags.go:64] FLAG: --max-pods="110" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519338 4574 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519382 4574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519433 4574 flags.go:64] FLAG: --memory-manager-policy="None" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519478 4574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519526 4574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519571 4574 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519614 4574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519663 4574 flags.go:64] FLAG: --node-status-max-images="50" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519706 4574 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519748 4574 flags.go:64] FLAG: --oom-score-adj="-999" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519792 4574 flags.go:64] FLAG: --pod-cidr="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519841 4574 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519889 4574 flags.go:64] FLAG: --pod-manifest-path="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519932 4574 flags.go:64] FLAG: --pod-max-pids="-1" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.519974 4574 flags.go:64] FLAG: --pods-per-core="0" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520017 4574 flags.go:64] FLAG: --port="10250" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520059 4574 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520101 4574 flags.go:64] FLAG: --provider-id="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520206 4574 flags.go:64] FLAG: --qos-reserved="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520287 4574 flags.go:64] FLAG: --read-only-port="10255" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520335 4574 flags.go:64] FLAG: --register-node="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520377 4574 flags.go:64] FLAG: --register-schedulable="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520420 4574 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520466 4574 flags.go:64] FLAG: --registry-burst="10" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520510 4574 flags.go:64] FLAG: --registry-qps="5" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520565 4574 flags.go:64] FLAG: --reserved-cpus="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520611 4574 flags.go:64] FLAG: --reserved-memory="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520657 4574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 
04:46:14.520717 4574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520782 4574 flags.go:64] FLAG: --rotate-certificates="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520827 4574 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520869 4574 flags.go:64] FLAG: --runonce="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520922 4574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.520968 4574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521011 4574 flags.go:64] FLAG: --seccomp-default="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521055 4574 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521097 4574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521147 4574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521206 4574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521287 4574 flags.go:64] FLAG: --storage-driver-password="root" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521336 4574 flags.go:64] FLAG: --storage-driver-secure="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521380 4574 flags.go:64] FLAG: --storage-driver-table="stats" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521422 4574 flags.go:64] FLAG: --storage-driver-user="root" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521465 4574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521511 4574 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 04 
04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521555 4574 flags.go:64] FLAG: --system-cgroups="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521597 4574 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521652 4574 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521696 4574 flags.go:64] FLAG: --tls-cert-file="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521744 4574 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521790 4574 flags.go:64] FLAG: --tls-min-version="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521832 4574 flags.go:64] FLAG: --tls-private-key-file="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521875 4574 flags.go:64] FLAG: --topology-manager-policy="none" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521920 4574 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.521968 4574 flags.go:64] FLAG: --topology-manager-scope="container" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.522021 4574 flags.go:64] FLAG: --v="2" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.522080 4574 flags.go:64] FLAG: --version="false" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.522134 4574 flags.go:64] FLAG: --vmodule="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.522180 4574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.522358 4574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522506 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522564 4574 feature_gate.go:330] unrecognized feature 
gate: RouteAdvertisements Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522609 4574 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522650 4574 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522691 4574 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522740 4574 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522789 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522845 4574 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522896 4574 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522939 4574 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.522980 4574 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523021 4574 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523062 4574 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523105 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523148 4574 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523196 4574 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523261 4574 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523306 4574 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523348 4574 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523390 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523432 4574 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523473 4574 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523523 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523567 4574 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523609 4574 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523650 4574 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523691 4574 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523732 4574 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523773 4574 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523822 4574 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523869 4574 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523928 4574 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.523987 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524049 4574 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524102 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524160 4574 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524219 4574 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524283 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524326 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524383 4574 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524456 4574 feature_gate.go:330] unrecognized feature gate: Example Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524505 4574 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524552 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524595 4574 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524636 4574 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524678 4574 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524721 4574 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524762 4574 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524804 4574 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524852 4574 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524896 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524938 4574 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.524980 4574 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525022 4574 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525068 4574 feature_gate.go:330] unrecognized feature 
gate: AWSEFSDriverVolumeMetrics Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525117 4574 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525164 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525208 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525274 4574 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525321 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525370 4574 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525414 4574 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525456 4574 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525502 4574 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525545 4574 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525587 4574 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525628 4574 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525670 4574 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525712 4574 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 
04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525754 4574 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.525801 4574 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.525853 4574 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.538490 4574 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.538526 4574 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538600 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538607 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538612 4574 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538616 4574 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538619 4574 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538623 4574 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538626 4574 feature_gate.go:330] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538630 4574 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538634 4574 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538638 4574 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538643 4574 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538649 4574 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538653 4574 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538658 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538662 4574 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538666 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538670 4574 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538674 4574 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538679 4574 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538684 4574 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538688 4574 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 04:46:14 crc 
kubenswrapper[4574]: W1004 04:46:14.538692 4574 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538697 4574 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538702 4574 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538709 4574 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538713 4574 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538717 4574 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538721 4574 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538725 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538729 4574 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538734 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538738 4574 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538741 4574 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538745 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538748 4574 feature_gate.go:330] unrecognized feature gate: 
VSphereStaticIPs Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538752 4574 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538756 4574 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538759 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538763 4574 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538782 4574 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538785 4574 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538789 4574 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538792 4574 feature_gate.go:330] unrecognized feature gate: Example Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538796 4574 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538801 4574 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538806 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538810 4574 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538814 4574 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538817 4574 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538821 4574 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538825 4574 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538828 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538832 4574 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538836 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538840 4574 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538843 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538847 4574 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538850 4574 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538854 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 
04:46:14.538857 4574 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538861 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538866 4574 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538871 4574 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538875 4574 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538879 4574 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538883 4574 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538886 4574 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538890 4574 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538894 4574 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538898 4574 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.538901 4574 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.538908 4574 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539026 4574 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539034 4574 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539038 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539041 4574 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539045 4574 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539049 4574 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539052 4574 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539056 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539060 4574 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539064 4574 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539067 4574 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539071 4574 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539074 4574 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 04 04:46:14 crc kubenswrapper[4574]: 
W1004 04:46:14.539078 4574 feature_gate.go:330] unrecognized feature gate: Example Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539082 4574 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539085 4574 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539089 4574 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539093 4574 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539096 4574 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539101 4574 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539107 4574 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539111 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539115 4574 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539119 4574 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539123 4574 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539127 4574 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539131 4574 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539135 4574 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539138 4574 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539143 4574 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539147 4574 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539152 4574 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539156 4574 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539160 4574 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539165 4574 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539170 4574 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539174 4574 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539178 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539183 4574 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539187 4574 feature_gate.go:330] unrecognized feature gate: 
AutomatedEtcdBackup Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539190 4574 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539194 4574 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539198 4574 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539202 4574 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539205 4574 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539208 4574 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539212 4574 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539216 4574 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539220 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539223 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539227 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539245 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539248 4574 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539252 4574 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 04 04:46:14 crc 
kubenswrapper[4574]: W1004 04:46:14.539255 4574 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539259 4574 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539263 4574 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539266 4574 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539270 4574 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539274 4574 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539278 4574 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539281 4574 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539285 4574 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539288 4574 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539291 4574 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539295 4574 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539298 4574 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539302 4574 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 
04:46:14.539305 4574 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539308 4574 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.539312 4574 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.539318 4574 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.539474 4574 server.go:940] "Client rotation is on, will bootstrap in background" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.544320 4574 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.544406 4574 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.545908 4574 server.go:997] "Starting client certificate rotation" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.545935 4574 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.546922 4574 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-10 19:24:48.782313524 +0000 UTC Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.547040 4574 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1622h38m34.235279262s for next certificate rotation Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.571064 4574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.573012 4574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.588081 4574 log.go:25] "Validated CRI v1 runtime API" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.627843 4574 log.go:25] "Validated CRI v1 image API" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.634281 4574 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.641499 4574 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-04-04-40-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.641528 4574 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.655322 4574 manager.go:217] Machine: {Timestamp:2025-10-04 04:46:14.652701023 +0000 UTC m=+0.506844085 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199472640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9757b487-9d09-40ae-a5ee-25ae49bc71e6 BootID:3b060499-a4fb-4547-9cda-a86b5d4fd2fa Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039894528 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076107 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599734272 Type:vfs Inodes:3076107 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ab:64:bf Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ab:64:bf Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:22:86:fa Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3b:69:96 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8c:e2:04 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:52:1e:bf Speed:-1 Mtu:1496} {Name:eth10 MacAddress:62:e4:5a:07:ee:fc Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a2:c8:3d:7d:71:15 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199472640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 
Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.655511 4574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.655743 4574 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.656085 4574 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.656264 4574 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.656294 4574 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.656465 4574 topology_manager.go:138] "Creating topology manager with none policy" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.656474 4574 container_manager_linux.go:303] "Creating device plugin manager" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.657207 4574 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.657261 4574 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.658356 4574 state_mem.go:36] "Initialized new in-memory state store" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.658436 4574 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.661879 4574 kubelet.go:418] "Attempting to sync node with API server" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.661902 4574 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.661943 4574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.661962 4574 kubelet.go:324] "Adding apiserver pod source" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.661973 4574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.667817 4574 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.669423 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.669488 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.669372 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.669542 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.669884 4574 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.672221 4574 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673636 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673658 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673688 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673695 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673705 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673714 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673720 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673731 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673739 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673747 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673763 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.673770 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.674537 4574 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.674968 4574 server.go:1280] "Started kubelet"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.675036 4574 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.675267 4574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.675711 4574 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.676136 4574 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused
Oct 04 04:46:14 crc systemd[1]: Started Kubernetes Kubelet.
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677038 4574 server.go:460] "Adding debug handlers to kubelet server"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677156 4574 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677196 4574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677325 4574 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677482 4574 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677716 4574 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 21:56:38.16653147 +0000 UTC
Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677752 4574 certificate_manager.go:356]
kubernetes.io/kubelet-serving: Waiting 1049h10m23.48878146s for next certificate rotation Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.677740 4574 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.677346 4574 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.678524 4574 factory.go:55] Registering systemd factory Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.678543 4574 factory.go:221] Registration of the systemd container factory successfully Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.678909 4574 factory.go:153] Registering CRI-O factory Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.679151 4574 factory.go:221] Registration of the crio container factory successfully Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.679183 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.684757 4574 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.684790 4574 factory.go:103] Registering Raw factory Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.684806 4574 manager.go:1196] Started watching for new ooms in manager Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.685300 4574 manager.go:319] Starting recovery of all containers Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.688014 4574 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.689623 4574 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.150:6443: connect: connection refused" interval="200ms" Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.688700 4574 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.150:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b304a33e92a05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-04 04:46:14.674942469 +0000 UTC m=+0.529085511,LastTimestamp:2025-10-04 04:46:14.674942469 +0000 UTC m=+0.529085511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696332 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696374 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696384 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696412 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696422 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696430 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696438 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696447 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696459 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696484 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696497 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696505 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696514 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696525 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696533 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696540 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696566 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696592 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696601 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696609 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696651 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696660 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696669 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696676 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696687 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696696 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 04 
04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696726 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696750 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696773 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696806 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696817 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696825 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696835 4574 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696844 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.696853 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.699973 4574 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700025 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700063 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 04 04:46:14 crc 
kubenswrapper[4574]: I1004 04:46:14.700074 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700099 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700117 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700127 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700137 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700145 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700154 4574 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700178 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700188 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700197 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700207 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700218 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700250 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700262 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700271 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700284 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700293 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700303 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700331 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700343 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700354 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700364 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700372 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700382 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700406 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700417 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700426 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700435 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700444 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700453 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700463 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700493 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700503 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700511 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700521 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700530 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700539 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: 
I1004 04:46:14.700565 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700576 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700586 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700594 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700603 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700612 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700620 4574 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700646 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700655 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700663 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700671 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700679 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700687 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700698 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700725 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700736 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700744 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700753 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700762 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700770 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700778 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700848 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700859 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700868 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700877 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700911 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700920 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700929 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700937 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700950 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700962 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700971 4574 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700981 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700990 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.700998 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701007 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701016 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701044 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701054 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701062 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701072 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701082 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701090 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701099 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701107 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701117 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701125 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701133 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701142 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701150 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701158 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701167 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701175 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701183 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701193 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701205 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701216 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701226 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701261 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701271 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701279 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701300 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701309 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701318 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701327 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701335 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701343 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701351 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701360 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701368 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701377 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701385 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701393 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701401 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701410 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701419 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701427 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701436 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701444 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701452 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701460 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701469 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701477 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701484 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701493 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701502 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701511 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701519 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701527 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701535 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701544 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701553 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" 
seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701562 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701572 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701580 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701599 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701632 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701643 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 
04:46:14.701652 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701660 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701670 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701679 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701687 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701698 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701706 4574 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701714 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701722 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701734 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701746 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701758 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701768 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701776 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701784 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701793 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701802 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701810 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701820 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701833 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701842 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701851 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701861 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701870 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701879 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701891 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701900 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701910 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701920 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701929 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701940 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701949 
4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701961 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701973 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701984 4574 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.701994 4574 reconstruct.go:97] "Volume reconstruction finished" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.702002 4574 reconciler.go:26] "Reconciler: start to sync state" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.702491 4574 manager.go:324] Recovery completed Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.710584 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.711692 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.711724 4574 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.711733 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.712287 4574 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.712309 4574 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.712327 4574 state_mem.go:36] "Initialized new in-memory state store" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.724918 4574 policy_none.go:49] "None policy: Start" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.726799 4574 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.726836 4574 state_mem.go:35] "Initializing new in-memory state store" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.730523 4574 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.731875 4574 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.731913 4574 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.731938 4574 kubelet.go:2335] "Starting kubelet main sync loop" Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.731978 4574 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 04 04:46:14 crc kubenswrapper[4574]: W1004 04:46:14.734321 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.734377 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.776126 4574 manager.go:334] "Starting Device Plugin manager" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.776384 4574 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.776411 4574 server.go:79] "Starting device plugin registration server" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.776789 4574 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.776807 4574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 04 04:46:14 crc 
kubenswrapper[4574]: I1004 04:46:14.776947 4574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.777042 4574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.777051 4574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.783762 4574 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.832204 4574 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.832286 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.833139 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.833175 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.833185 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.833358 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.833530 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.833585 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834193 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834214 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834226 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834256 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834272 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834260 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834392 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834566 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.834595 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.835172 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.835172 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.835227 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.835274 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.835203 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.835702 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.836720 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.836989 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.837101 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839293 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839320 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839332 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839375 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839399 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839459 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839689 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839731 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839972 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.839992 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.840001 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.840155 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.840179 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.841535 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.841564 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.841573 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.843772 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.843798 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc 
kubenswrapper[4574]: I1004 04:46:14.843809 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.877161 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.878106 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.878136 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.878147 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.878170 4574 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.878666 4574 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.150:6443: connect: connection refused" node="crc" Oct 04 04:46:14 crc kubenswrapper[4574]: E1004 04:46:14.890260 4574 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.150:6443: connect: connection refused" interval="400ms" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904294 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904464 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904606 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904822 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904869 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904889 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904907 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904933 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904953 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904969 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.904985 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.905013 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.905049 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.905144 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:14 crc kubenswrapper[4574]: I1004 04:46:14.905169 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006833 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006884 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006900 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006916 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006931 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006945 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006959 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006972 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.006987 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007000 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007013 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007026 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007033 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007039 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007014 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007155 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007117 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007135 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007130 4574 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007170 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007078 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007274 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007269 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007301 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007325 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007330 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007366 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007425 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007466 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.007601 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.079484 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.080883 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.080997 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.081106 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.081262 4574 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 04:46:15 crc kubenswrapper[4574]: E1004 04:46:15.081693 4574 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.150:6443: connect: connection refused" node="crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.157895 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.177407 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.185969 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.194966 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0c47f670937af54f6e7d572fecab00b026a24d7155dbe5251ba51b200b5f790f WatchSource:0}: Error finding container 0c47f670937af54f6e7d572fecab00b026a24d7155dbe5251ba51b200b5f790f: Status 404 returned error can't find the container with id 0c47f670937af54f6e7d572fecab00b026a24d7155dbe5251ba51b200b5f790f Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.208282 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.212325 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-fb6d910a1ff1912a6b94e1d5ac76df64f68d68d01e6233c54107958e5da88ef6 WatchSource:0}: Error finding container fb6d910a1ff1912a6b94e1d5ac76df64f68d68d01e6233c54107958e5da88ef6: Status 404 returned error can't find the container with id fb6d910a1ff1912a6b94e1d5ac76df64f68d68d01e6233c54107958e5da88ef6 Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.213072 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.216342 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f8f7410ea3ee64b65926f686509b580e60ab5b72a113826ff1573af7dd6e99a3 WatchSource:0}: Error finding container f8f7410ea3ee64b65926f686509b580e60ab5b72a113826ff1573af7dd6e99a3: Status 404 returned error can't find the container with id f8f7410ea3ee64b65926f686509b580e60ab5b72a113826ff1573af7dd6e99a3 Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.225548 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6ca3aa72d3441ca1b3e983c0f6766fce745ea88b63dc8563b41bad40090cb67c WatchSource:0}: Error finding container 6ca3aa72d3441ca1b3e983c0f6766fce745ea88b63dc8563b41bad40090cb67c: Status 404 returned error can't find the container with id 6ca3aa72d3441ca1b3e983c0f6766fce745ea88b63dc8563b41bad40090cb67c Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.230583 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1e7a3ab9ee6c8a8db6565689bdbb757087b0b01bc3ebd9a25fb8e98080f2676d WatchSource:0}: Error finding container 1e7a3ab9ee6c8a8db6565689bdbb757087b0b01bc3ebd9a25fb8e98080f2676d: Status 404 returned error can't find the container with id 1e7a3ab9ee6c8a8db6565689bdbb757087b0b01bc3ebd9a25fb8e98080f2676d Oct 04 04:46:15 crc kubenswrapper[4574]: E1004 04:46:15.291454 4574 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.150:6443: connect: connection 
refused" interval="800ms" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.482050 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.483303 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.483335 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.483346 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.483369 4574 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 04:46:15 crc kubenswrapper[4574]: E1004 04:46:15.483752 4574 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.150:6443: connect: connection refused" node="crc" Oct 04 04:46:15 crc kubenswrapper[4574]: E1004 04:46:15.644578 4574 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.150:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b304a33e92a05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-04 04:46:14.674942469 +0000 UTC m=+0.529085511,LastTimestamp:2025-10-04 04:46:14.674942469 +0000 UTC m=+0.529085511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 04 04:46:15 crc 
kubenswrapper[4574]: I1004 04:46:15.676744 4574 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.712295 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:15 crc kubenswrapper[4574]: E1004 04:46:15.712365 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.735759 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1e7a3ab9ee6c8a8db6565689bdbb757087b0b01bc3ebd9a25fb8e98080f2676d"} Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.737254 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ca3aa72d3441ca1b3e983c0f6766fce745ea88b63dc8563b41bad40090cb67c"} Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.738273 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27"} Oct 04 04:46:15 crc 
kubenswrapper[4574]: I1004 04:46:15.738323 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f8f7410ea3ee64b65926f686509b580e60ab5b72a113826ff1573af7dd6e99a3"} Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.738413 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.739161 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.739190 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.739199 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.739949 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb6d910a1ff1912a6b94e1d5ac76df64f68d68d01e6233c54107958e5da88ef6"} Oct 04 04:46:15 crc kubenswrapper[4574]: I1004 04:46:15.741056 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0c47f670937af54f6e7d572fecab00b026a24d7155dbe5251ba51b200b5f790f"} Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.764468 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:15 crc 
kubenswrapper[4574]: E1004 04:46:15.764543 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:15 crc kubenswrapper[4574]: W1004 04:46:15.880601 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:15 crc kubenswrapper[4574]: E1004 04:46:15.880683 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:16 crc kubenswrapper[4574]: W1004 04:46:16.063054 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:16 crc kubenswrapper[4574]: E1004 04:46:16.063204 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:16 crc kubenswrapper[4574]: E1004 04:46:16.092924 4574 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.150:6443: connect: connection refused" interval="1.6s" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.284319 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.285387 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.285453 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.285481 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.285522 4574 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 04:46:16 crc kubenswrapper[4574]: E1004 04:46:16.286228 4574 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.150:6443: connect: connection refused" node="crc" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.677168 4574 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.744751 4574 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873" exitCode=0 Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.744827 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.744834 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.745573 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.745600 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.745608 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.746985 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.747484 4574 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="31a6b44b4e9de554d7c45f05cb9543944422740db3ce14f22975a29ebecc2d34" exitCode=0 Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.747610 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.747699 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.747725 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.747733 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.747994 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"31a6b44b4e9de554d7c45f05cb9543944422740db3ce14f22975a29ebecc2d34"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.748391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.748428 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.748443 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.750197 4574 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5" exitCode=0 Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.750282 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.750364 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.751735 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.751758 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.751766 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.753038 4574 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27" exitCode=0 Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.753083 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.753144 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.760400 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.760450 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.760464 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.760480 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713"} Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.760613 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.762324 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.762352 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.762363 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.762988 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.763011 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:16 crc kubenswrapper[4574]: I1004 04:46:16.763020 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.676919 4574 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:17 crc kubenswrapper[4574]: E1004 04:46:17.693746 4574 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.150:6443: connect: connection refused" interval="3.2s" Oct 04 04:46:17 crc 
kubenswrapper[4574]: I1004 04:46:17.764548 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fb6d829e8e2b175da20467551561cc9334f2c5c5ab10997ce4c48b1ac04e99e3"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.764613 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.765603 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.765628 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.765639 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.766895 4574 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a" exitCode=0 Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.766950 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.767057 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.767944 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.767973 4574 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.767982 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.770079 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"05d7fc97e7eaec546ada9f50953b3522e334170238d4a8952b5aeed65e5b8b04"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.770117 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f5de8b7c2160d35c3a66208e600b51e26b0f637d8367954ef6cf0f0ab502ba40"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.770128 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"718dd5c90711b2958bc92eaabc759a550e347e5b7ea17b27a562a5b6a9b1f7ee"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.770248 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.771088 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.771128 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.771138 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.773909 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.773939 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.773953 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.773962 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.773971 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53"} Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.773980 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.774021 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.777143 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:17 crc 
kubenswrapper[4574]: I1004 04:46:17.777177 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.777186 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.777481 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.777501 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.777512 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:17 crc kubenswrapper[4574]: W1004 04:46:17.803570 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:17 crc kubenswrapper[4574]: E1004 04:46:17.803663 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.887782 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.889000 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.889038 4574 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.889046 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:17 crc kubenswrapper[4574]: I1004 04:46:17.889087 4574 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 04:46:17 crc kubenswrapper[4574]: E1004 04:46:17.889526 4574 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.150:6443: connect: connection refused" node="crc" Oct 04 04:46:17 crc kubenswrapper[4574]: W1004 04:46:17.987088 4574 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.150:6443: connect: connection refused Oct 04 04:46:17 crc kubenswrapper[4574]: E1004 04:46:17.987162 4574 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.150:6443: connect: connection refused" logger="UnhandledError" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.674812 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.778418 4574 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673" exitCode=0 Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.778514 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673"} Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.778535 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.778607 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.778647 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.778527 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.779593 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.779621 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.779594 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.779645 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.779657 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.779632 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.780000 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.780027 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:18 crc kubenswrapper[4574]: I1004 04:46:18.780039 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.783808 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1"} Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.784073 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef"} Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.784084 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77"} Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.784097 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6"} Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.784106 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae"} Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.783858 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.783858 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.785144 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.785173 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.785183 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.785363 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.785393 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:19 crc kubenswrapper[4574]: I1004 04:46:19.785791 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.177549 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.177748 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.178759 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.178805 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.178818 
4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.786283 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.787031 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.787072 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:20 crc kubenswrapper[4574]: I1004 04:46:20.787084 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.090371 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.091553 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.091583 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.091595 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.091619 4574 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.550079 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.550341 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:21 crc 
kubenswrapper[4574]: I1004 04:46:21.551616 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.551659 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:21 crc kubenswrapper[4574]: I1004 04:46:21.551701 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.701884 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.702050 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.703034 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.703071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.703080 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.921621 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.921791 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.922763 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.922786 4574 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:22 crc kubenswrapper[4574]: I1004 04:46:22.922794 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.028436 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.028604 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.029611 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.029640 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.029651 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.033456 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.792822 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.793608 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.793654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:23 crc kubenswrapper[4574]: I1004 04:46:23.793663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:24 crc kubenswrapper[4574]: E1004 
04:46:24.783838 4574 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.702152 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.702296 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.703151 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.703183 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.703192 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.706837 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.796691 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.796893 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.799787 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.799839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 
04:46:25.799861 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:25 crc kubenswrapper[4574]: I1004 04:46:25.801855 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:26 crc kubenswrapper[4574]: I1004 04:46:26.028824 4574 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 04 04:46:26 crc kubenswrapper[4574]: I1004 04:46:26.028911 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 04:46:26 crc kubenswrapper[4574]: I1004 04:46:26.798976 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:26 crc kubenswrapper[4574]: I1004 04:46:26.799743 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:26 crc kubenswrapper[4574]: I1004 04:46:26.799771 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:26 crc kubenswrapper[4574]: I1004 04:46:26.799780 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:27 crc kubenswrapper[4574]: I1004 04:46:27.800425 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:27 
crc kubenswrapper[4574]: I1004 04:46:27.801070 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:27 crc kubenswrapper[4574]: I1004 04:46:27.801096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:27 crc kubenswrapper[4574]: I1004 04:46:27.801104 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:28 crc kubenswrapper[4574]: I1004 04:46:28.103720 4574 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 04 04:46:28 crc kubenswrapper[4574]: I1004 04:46:28.103773 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 04 04:46:28 crc kubenswrapper[4574]: I1004 04:46:28.128938 4574 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 04 04:46:28 crc kubenswrapper[4574]: I1004 04:46:28.129005 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 04 04:46:28 crc kubenswrapper[4574]: I1004 04:46:28.679414 4574 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]log ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]etcd ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/generic-apiserver-start-informers ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/priority-and-fairness-filter ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-apiextensions-informers ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-apiextensions-controllers ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/crd-informer-synced ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-system-namespaces-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 04 04:46:28 crc kubenswrapper[4574]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 04 04:46:28 crc kubenswrapper[4574]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/bootstrap-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/start-kube-aggregator-informers ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/apiservice-registration-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/apiservice-discovery-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]autoregister-completion ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/apiservice-openapi-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 04 04:46:28 crc kubenswrapper[4574]: livez check failed Oct 04 04:46:28 crc kubenswrapper[4574]: I1004 04:46:28.679474 4574 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:46:32 crc kubenswrapper[4574]: I1004 04:46:32.955081 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 04 04:46:32 crc kubenswrapper[4574]: I1004 04:46:32.955326 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:32 crc kubenswrapper[4574]: I1004 04:46:32.956764 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:32 crc kubenswrapper[4574]: I1004 04:46:32.956822 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:32 crc kubenswrapper[4574]: I1004 04:46:32.956835 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:32 crc kubenswrapper[4574]: I1004 04:46:32.970714 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.088971 4574 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.092698 4574 trace.go:236] Trace[377410178]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Oct-2025 04:46:23.072) (total time: 10019ms): Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[377410178]: ---"Objects listed" error: 10019ms (04:46:33.092) Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[377410178]: [10.019732516s] [10.019732516s] END Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.092756 
4574 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.093332 4574 trace.go:236] Trace[343585742]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Oct-2025 04:46:18.462) (total time: 14631ms):
Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[343585742]: ---"Objects listed" error: 14631ms (04:46:33.093)
Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[343585742]: [14.631191983s] [14.631191983s] END
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.093362 4574 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.094796 4574 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.095486 4574 trace.go:236] Trace[103595458]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Oct-2025 04:46:21.740) (total time: 11355ms):
Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[103595458]: ---"Objects listed" error: 11355ms (04:46:33.095)
Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[103595458]: [11.355416764s] [11.355416764s] END
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.095505 4574 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.095545 4574 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.095917 4574 trace.go:236] Trace[1633410129]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Oct-2025 04:46:18.483) (total time: 14611ms):
Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[1633410129]: ---"Objects listed" error: 14611ms (04:46:33.095)
Oct 04 04:46:33 crc kubenswrapper[4574]: Trace[1633410129]: [14.611883901s] [14.611883901s] END
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.095954 4574 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.131296 4574 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body=
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.131349 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.185163 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.189333 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.674196 4574 apiserver.go:52] "Watching apiserver"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.677057 4574 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.677396 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.677695 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.677741 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.677755 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.677695 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.677970 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.678163 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.678262 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.678282 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.678293 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.678727 4574 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680042 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680627 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680741 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680788 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680846 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680851 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680915 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.680901 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.681650 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.681694 4574 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.682326 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.684019 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.685055 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.695904 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.698832 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.698871 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.698901 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.698924 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.698945 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.698965 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.698987 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699010 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699029 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699052 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699076 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699098 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699118 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699138 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699157 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699180 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699245 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699272 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699295 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699354 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699383 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699406 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699442 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699467 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699489 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699512 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699533 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699556 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699582 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699604 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.699627 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700337 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700432 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700450 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700474 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700517 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700556 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700564 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700635 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700638 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700663 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700730 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700746 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700785 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700818 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700822 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700866 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700888 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700907 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700935 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701037 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701073 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701098 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701117 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701133 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701151 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701170 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701186 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701205 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701223 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701289 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701353 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701372 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701392 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701494 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701527 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701557 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701580 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701611 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701645 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701672 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701703 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701726 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701746 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701765 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701786 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701806 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701835 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 04 04:46:33 crc 
kubenswrapper[4574]: I1004 04:46:33.701855 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701876 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701896 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701917 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701936 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701953 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701975 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701995 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702018 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702036 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702062 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 
04:46:33.702083 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702101 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702123 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702142 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702160 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702180 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702199 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702218 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702257 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702276 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702295 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702313 4574 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702332 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702590 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702617 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702634 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702655 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702679 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702777 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702797 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702819 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702836 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 
04:46:33.702858 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702880 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702902 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702919 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702941 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703023 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703043 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703067 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703089 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703160 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703180 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 04 04:46:33 crc 
kubenswrapper[4574]: I1004 04:46:33.703212 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703257 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703276 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703301 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703333 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703359 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703388 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703416 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703442 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703460 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703482 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 
04:46:33.703506 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703526 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703577 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703610 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703640 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703674 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703706 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703739 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703765 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703793 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703822 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703847 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703875 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703911 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703933 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703953 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703977 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703997 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704022 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704150 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704170 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704191 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 
04:46:33.704213 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704230 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704265 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704288 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704308 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704325 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704346 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704461 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704480 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704499 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704519 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704540 4574 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704558 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704578 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704599 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704616 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704637 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" 
(UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704659 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704679 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704700 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704724 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704745 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704763 4574 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704784 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704802 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704819 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704839 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704860 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704878 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704897 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704917 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704937 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704956 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704976 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704998 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705016 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705037 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705058 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705080 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:33 crc 
kubenswrapper[4574]: I1004 04:46:33.705099 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705135 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705163 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705189 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705222 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705263 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705863 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705905 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705928 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705953 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705985 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706019 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.700933 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706046 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706096 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706109 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706144 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706177 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706434 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706461 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706486 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706513 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706614 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706638 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706661 4574 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc 
kubenswrapper[4574]: I1004 04:46:33.706675 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706691 4574 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706709 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706723 4574 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.707224 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:46:34.207195218 +0000 UTC m=+20.061338260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.707509 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.707791 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.708054 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.708309 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.708580 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.708816 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.708845 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.709043 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.709103 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.709263 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.709280 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701544 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.709458 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.710163 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.710689 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.710865 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.711154 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.711438 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.711533 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.711564 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.711719 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.711795 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.711806 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.710065 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.712159 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.712222 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.712716 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.713341 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.713602 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.713785 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.714335 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.714375 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.714635 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.714828 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701128 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701376 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701614 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.701668 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702047 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702068 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702223 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702316 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702328 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702587 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.702856 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703017 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703029 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703129 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703477 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703942 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.703929 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704325 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704363 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704545 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704663 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.704989 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705140 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.705262 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.706026 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.715142 4574 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.715155 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.715547 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:34.215520653 +0000 UTC m=+20.069663695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.715554 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.716065 4574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.718187 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.720405 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.720481 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:34.220464778 +0000 UTC m=+20.074607820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.721033 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.721579 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.721830 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.722004 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.722431 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.722568 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.722700 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.723069 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.723402 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.723410 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.723726 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.723878 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.723997 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.724161 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.724376 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.724652 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.724803 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.724891 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.724964 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.725096 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:34.225071244 +0000 UTC m=+20.079214286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.724974 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.728646 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725318 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725363 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725588 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725647 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725811 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725887 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725963 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.724869 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.726412 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.728096 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.728320 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.728569 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.728717 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.728865 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.725389 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.729386 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.729728 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.730330 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.730523 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.730725 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.731145 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.731423 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.731576 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.731811 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.732163 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.732380 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.732544 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.733415 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.733480 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.733508 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.733595 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.733830 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.734034 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.734135 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.734483 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.734956 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.735080 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.735459 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.735616 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.735673 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.735705 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736029 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736035 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736193 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736413 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736429 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736597 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736717 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.736964 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.737137 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.737224 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.737263 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.737529 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.737793 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.738035 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.738263 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.739466 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.739817 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.739892 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.739919 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.739968 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.740165 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.740275 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.740387 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.740648 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.740721 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.740991 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741147 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741171 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741522 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741584 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741762 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741794 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741817 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.741950 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.742088 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.742129 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.742216 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.742495 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.742690 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.742641 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.743513 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.743724 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.743825 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.743876 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.743898 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.744420 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.744780 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.745074 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.745970 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.746025 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.746060 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.747007 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.744675 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.747514 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.747555 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.747765 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.748081 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.748401 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.748927 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.749046 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.749269 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.749408 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.749859 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.750212 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.750507 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.750567 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.752169 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.756204 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.757308 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.761148 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.761737 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.773843 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.776903 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.777916 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.784022 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.785528 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.785558 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.785570 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.785621 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:34.285603085 +0000 UTC m=+20.139746127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.786630 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.806365 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808219 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808347 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808447 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808468 4574 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808481 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808492 4574 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808508 4574 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808518 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808529 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808537 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808546 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808554 4574 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808562 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808573 4574 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808584 4574 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808595 4574 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 
04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808606 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808617 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808628 4574 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808638 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808649 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808657 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808665 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 
04:46:33.808673 4574 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808681 4574 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808688 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808697 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808706 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808734 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808747 4574 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808758 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808770 4574 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808781 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808793 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808803 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808813 4574 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808823 4574 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808835 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" 
Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808847 4574 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808860 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808871 4574 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808883 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808896 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808908 4574 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808919 4574 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808929 4574 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808939 4574 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808950 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808958 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808967 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808977 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808986 4574 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.808996 4574 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809005 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809013 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809021 4574 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809030 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809040 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809049 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809059 4574 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809067 4574 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809075 4574 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809084 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809092 4574 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809127 4574 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809135 4574 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809144 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" 
DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809151 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809159 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809167 4574 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809175 4574 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809183 4574 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809191 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809204 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc 
kubenswrapper[4574]: I1004 04:46:33.809212 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809219 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809249 4574 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809281 4574 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809301 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809311 4574 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809321 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809330 4574 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809341 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809349 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809363 4574 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809371 4574 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809379 4574 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809387 4574 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809397 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809406 4574 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809414 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809422 4574 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809431 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809441 4574 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809450 4574 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809458 4574 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath 
\"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809467 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809475 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809484 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809492 4574 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809501 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809510 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809518 4574 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809527 4574 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809536 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809547 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809555 4574 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809566 4574 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809575 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809585 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809604 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809613 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809616 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809637 4574 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809648 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809658 4574 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809667 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809676 4574 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809685 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809693 4574 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809702 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809710 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809718 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809727 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809735 4574 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809744 4574 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809753 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809762 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809770 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809778 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809786 4574 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809796 4574 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") 
on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809804 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809811 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809819 4574 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809828 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809836 4574 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809844 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809853 4574 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809861 4574 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809869 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809877 4574 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809884 4574 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809894 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809906 4574 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809915 4574 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809922 4574 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809931 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809939 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809947 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809956 4574 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809964 4574 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809973 4574 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809980 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809989 4574 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.809998 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810007 4574 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810015 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810022 4574 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810030 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810039 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc 
kubenswrapper[4574]: I1004 04:46:33.810047 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810057 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810065 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810073 4574 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810080 4574 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810089 4574 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810097 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810106 4574 reconciler_common.go:293] "Volume 
detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810114 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810121 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810129 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810137 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810145 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810152 4574 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810160 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810170 4574 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810178 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810186 4574 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810189 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810194 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810216 4574 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810226 4574 
reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810263 4574 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810272 4574 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810280 4574 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810288 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810296 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810306 4574 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810315 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.810322 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.820040 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.820649 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.821892 4574 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6" exitCode=255 Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.822083 4574 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6"} Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.828463 4574 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 04 04:46:33 crc kubenswrapper[4574]: E1004 04:46:33.830803 4574 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.831055 4574 scope.go:117] "RemoveContainer" containerID="38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.835330 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.843764 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.852890 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.871499 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac2
1da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.883326 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.897873 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.907899 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.919517 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.932659 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.945814 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.957590 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.968036 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.977184 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.987882 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.992372 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 04 04:46:33 crc kubenswrapper[4574]: I1004 04:46:33.999664 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 04 04:46:34 crc kubenswrapper[4574]: W1004 04:46:34.005179 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a545182040bf884abefa64b1315a7f4c4c6a2657c01d94338d08e75145d76d5f WatchSource:0}: Error finding container a545182040bf884abefa64b1315a7f4c4c6a2657c01d94338d08e75145d76d5f: Status 404 returned error can't find the container with id a545182040bf884abefa64b1315a7f4c4c6a2657c01d94338d08e75145d76d5f Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.005325 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\"
:\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.018529 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.018970 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.030884 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.214078 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.214217 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:46:35.214201877 +0000 UTC m=+21.068344919 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.314722 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.314772 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.314798 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 
04:46:34.314821 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.314946 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315002 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315012 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:35.314992403 +0000 UTC m=+21.169135445 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315020 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315002 4574 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315153 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:35.315124977 +0000 UTC m=+21.169268019 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315034 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315248 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:35.31522722 +0000 UTC m=+21.169370262 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.314953 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315290 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315303 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:34 crc kubenswrapper[4574]: E1004 04:46:34.315332 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:35.315325203 +0000 UTC m=+21.169468245 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.736006 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.736829 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.738218 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.739202 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.740573 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.741104 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.741714 4574 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.743117 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.743770 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.744785 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.745282 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.746376 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.746827 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.747369 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.748445 4574 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.748929 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.749859 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.750253 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.750822 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.751810 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.752285 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.754396 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.754448 4574 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.754815 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.756514 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.757025 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.757658 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.758712 
4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.759171 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.761617 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.762415 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.763025 4574 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.763215 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.765106 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.765754 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 04 
04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.766164 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.769040 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.769750 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.770647 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.771288 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.772293 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.772811 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.773837 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 04 
04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.774560 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.775477 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.775972 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.776906 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.777425 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.777822 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.778484 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.778913 4574 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.780286 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.781051 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.782049 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.784193 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.786303 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.796560 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.809115 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.826560 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.828610 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1"} Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.829164 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.829464 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f07f5c047018d9ef817b794491fc8f677ac6242a5956cd98dad71b0475ea5994"} Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.831438 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa"} Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.831462 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c"} Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.831472 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"38e1ebf680943467ae5764dff94df795fe2b1e27c07bd213e819c3f864616684"} Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.832098 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.832522 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a"} Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.832554 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a545182040bf884abefa64b1315a7f4c4c6a2657c01d94338d08e75145d76d5f"} Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.844738 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.861050 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.874346 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.888824 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.912170 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.932519 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.950980 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.971564 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:34 crc kubenswrapper[4574]: I1004 04:46:34.998783 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.018670 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] 
\\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543a
d2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:35Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.033319 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:35Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.048528 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:35Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.066453 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:35Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.222756 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.222998 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:46:37.22296402 +0000 UTC m=+23.077107062 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.324091 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.324130 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.324148 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.324176 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324280 4574 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324307 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324312 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324323 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324368 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324378 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324354 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:37.324338594 +0000 UTC m=+23.178481626 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324406 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324417 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324421 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:37.324409286 +0000 UTC m=+23.178552398 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324439 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:37.324432036 +0000 UTC m=+23.178575168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.324463 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:37.324449067 +0000 UTC m=+23.178592099 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.732799 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.732799 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:35 crc kubenswrapper[4574]: I1004 04:46:35.732830 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.733799 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.733847 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:35 crc kubenswrapper[4574]: E1004 04:46:35.733666 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:36 crc kubenswrapper[4574]: I1004 04:46:36.837486 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41"} Oct 04 04:46:36 crc kubenswrapper[4574]: I1004 04:46:36.853585 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:36Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:36 crc kubenswrapper[4574]: I1004 04:46:36.874216 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:36Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:36 crc kubenswrapper[4574]: I1004 04:46:36.906110 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:36Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:36 crc kubenswrapper[4574]: I1004 04:46:36.951449 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:36Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:36 crc kubenswrapper[4574]: I1004 04:46:36.984035 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:36Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.025930 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.092080 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dmzfp"] Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.092385 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.095148 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.095153 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.095326 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.103281 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.124932 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.136638 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96222110-95c8-4caa-b42a-7526e39ae0e7-hosts-file\") pod \"node-resolver-dmzfp\" (UID: \"96222110-95c8-4caa-b42a-7526e39ae0e7\") " pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.136865 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgkv\" (UniqueName: \"kubernetes.io/projected/96222110-95c8-4caa-b42a-7526e39ae0e7-kube-api-access-vtgkv\") pod \"node-resolver-dmzfp\" (UID: \"96222110-95c8-4caa-b42a-7526e39ae0e7\") " pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.154075 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.184973 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.196650 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.216224 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC 
(now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c241
0873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.230439 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.237397 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.237536 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:46:41.237514999 +0000 UTC m=+27.091658051 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.237784 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgkv\" (UniqueName: \"kubernetes.io/projected/96222110-95c8-4caa-b42a-7526e39ae0e7-kube-api-access-vtgkv\") pod \"node-resolver-dmzfp\" (UID: \"96222110-95c8-4caa-b42a-7526e39ae0e7\") " pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.237926 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96222110-95c8-4caa-b42a-7526e39ae0e7-hosts-file\") pod \"node-resolver-dmzfp\" (UID: \"96222110-95c8-4caa-b42a-7526e39ae0e7\") " pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.238089 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96222110-95c8-4caa-b42a-7526e39ae0e7-hosts-file\") pod \"node-resolver-dmzfp\" (UID: \"96222110-95c8-4caa-b42a-7526e39ae0e7\") " pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.261534 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.281061 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgkv\" (UniqueName: \"kubernetes.io/projected/96222110-95c8-4caa-b42a-7526e39ae0e7-kube-api-access-vtgkv\") pod \"node-resolver-dmzfp\" (UID: \"96222110-95c8-4caa-b42a-7526e39ae0e7\") " pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.288762 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.301963 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.322954 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.334204 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.338442 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.338612 4574 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.338635 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.338715 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:41.338688446 +0000 UTC m=+27.192831548 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.338809 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.338843 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") 
pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.338953 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.338976 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.338987 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.338953 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.339031 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:41.339018086 +0000 UTC m=+27.193161128 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.339042 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.339054 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.339088 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:41.339075418 +0000 UTC m=+27.193218520 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.338955 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.339128 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:41.339120009 +0000 UTC m=+27.193263141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.360610 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.406210 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dmzfp" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.732669 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.732689 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.732669 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.732794 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.732882 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:37 crc kubenswrapper[4574]: E1004 04:46:37.732950 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.840654 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dmzfp" event={"ID":"96222110-95c8-4caa-b42a-7526e39ae0e7","Type":"ContainerStarted","Data":"6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb"} Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.840708 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dmzfp" event={"ID":"96222110-95c8-4caa-b42a-7526e39ae0e7","Type":"ContainerStarted","Data":"436aaad5f0619ed5e7d8627f73fc654d244b7bd54fe316a53d0a1e16f0cba14f"} Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.856059 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.867368 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.886163 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.893039 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6wsfn"] Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.893775 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-b9dlv"] Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.893972 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.894827 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wl5xt"] Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.895042 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.895270 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.898488 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.898549 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.898995 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.899032 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.899256 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.899351 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.899393 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.899463 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.899519 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.899561 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 
04:46:37.900670 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.903470 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.916267 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.934317 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.945204 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-cnibin\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.945547 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-cni-bin\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.945665 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-etc-kubernetes\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.945779 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-system-cni-dir\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.945873 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-k8s-cni-cncf-io\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.945961 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-conf-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946084 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75910bdc-1940-4d15-b390-4bcfcec9f72c-proxy-tls\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946212 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-socket-dir-parent\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946356 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cnibin\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946474 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-multus-certs\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946624 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946755 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/649982aa-c9c5-41ce-a056-48ad058e9aa5-cni-binary-copy\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946876 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/75910bdc-1940-4d15-b390-4bcfcec9f72c-rootfs\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.946991 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lclg\" (UniqueName: \"kubernetes.io/projected/75910bdc-1940-4d15-b390-4bcfcec9f72c-kube-api-access-6lclg\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947108 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947216 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " 
pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947329 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-kubelet\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947456 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgjqq\" (UniqueName: \"kubernetes.io/projected/4d896311-2a08-4a70-b74e-2a9b10abc7ae-kube-api-access-jgjqq\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947586 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs2cd\" (UniqueName: \"kubernetes.io/projected/649982aa-c9c5-41ce-a056-48ad058e9aa5-kube-api-access-qs2cd\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947703 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75910bdc-1940-4d15-b390-4bcfcec9f72c-mcd-auth-proxy-config\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947821 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-os-release\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.947925 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-netns\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.948043 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-hostroot\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.948297 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-os-release\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.948453 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-cni-multus\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.948582 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-system-cni-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.948712 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.948758 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-cni-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.948923 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-daemon-config\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.961301 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.975836 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:37 crc kubenswrapper[4574]: I1004 04:46:37.988034 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.000793 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:37Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.013684 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.024447 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.042166 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050039 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgjqq\" (UniqueName: \"kubernetes.io/projected/4d896311-2a08-4a70-b74e-2a9b10abc7ae-kube-api-access-jgjqq\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050328 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-kubelet\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050442 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-netns\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050553 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs2cd\" 
(UniqueName: \"kubernetes.io/projected/649982aa-c9c5-41ce-a056-48ad058e9aa5-kube-api-access-qs2cd\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050646 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75910bdc-1940-4d15-b390-4bcfcec9f72c-mcd-auth-proxy-config\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050743 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-os-release\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050833 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-hostroot\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050917 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-os-release\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050526 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-netns\") pod 
\"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051028 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-os-release\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050388 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-kubelet\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051013 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-cni-multus\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.050949 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-hostroot\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051018 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-os-release\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 
04:46:38.051082 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-system-cni-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051110 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-daemon-config\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051133 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-cni-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051147 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-system-cni-dir\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051163 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-cnibin\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051177 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-cni-bin\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051191 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-etc-kubernetes\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051205 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-k8s-cni-cncf-io\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051212 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-system-cni-dir\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051210 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-system-cni-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051220 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75910bdc-1940-4d15-b390-4bcfcec9f72c-proxy-tls\") pod \"machine-config-daemon-wl5xt\" (UID: 
\"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051284 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-cnibin\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051286 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-cni-bin\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051312 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-conf-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051324 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-etc-kubernetes\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051339 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-conf-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051348 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75910bdc-1940-4d15-b390-4bcfcec9f72c-mcd-auth-proxy-config\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051373 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-socket-dir-parent\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051401 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-multus-certs\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051410 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-socket-dir-parent\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051354 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-k8s-cni-cncf-io\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051346 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-cni-dir\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051444 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cnibin\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051467 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051476 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cnibin\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051485 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051532 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/649982aa-c9c5-41ce-a056-48ad058e9aa5-cni-binary-copy\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051465 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-run-multus-certs\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051557 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/75910bdc-1940-4d15-b390-4bcfcec9f72c-rootfs\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051582 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lclg\" (UniqueName: \"kubernetes.io/projected/75910bdc-1940-4d15-b390-4bcfcec9f72c-kube-api-access-6lclg\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051607 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051659 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/75910bdc-1940-4d15-b390-4bcfcec9f72c-rootfs\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.051892 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/649982aa-c9c5-41ce-a056-48ad058e9aa5-multus-daemon-config\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.052005 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.052168 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/649982aa-c9c5-41ce-a056-48ad058e9aa5-host-var-lib-cni-multus\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.052177 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/649982aa-c9c5-41ce-a056-48ad058e9aa5-cni-binary-copy\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.052306 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d896311-2a08-4a70-b74e-2a9b10abc7ae-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.052310 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d896311-2a08-4a70-b74e-2a9b10abc7ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.055984 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75910bdc-1940-4d15-b390-4bcfcec9f72c-proxy-tls\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.057674 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.066398 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs2cd\" (UniqueName: \"kubernetes.io/projected/649982aa-c9c5-41ce-a056-48ad058e9aa5-kube-api-access-qs2cd\") pod \"multus-6wsfn\" (UID: \"649982aa-c9c5-41ce-a056-48ad058e9aa5\") " pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.071857 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgjqq\" (UniqueName: 
\"kubernetes.io/projected/4d896311-2a08-4a70-b74e-2a9b10abc7ae-kube-api-access-jgjqq\") pod \"multus-additional-cni-plugins-b9dlv\" (UID: \"4d896311-2a08-4a70-b74e-2a9b10abc7ae\") " pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.072824 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.073324 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lclg\" (UniqueName: \"kubernetes.io/projected/75910bdc-1940-4d15-b390-4bcfcec9f72c-kube-api-access-6lclg\") pod \"machine-config-daemon-wl5xt\" (UID: \"75910bdc-1940-4d15-b390-4bcfcec9f72c\") " pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.082832 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.091740 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.103304 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.115461 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.125129 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.137748 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.156831 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ce
aa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.170653 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.212010 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6wsfn" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.223304 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" Oct 04 04:46:38 crc kubenswrapper[4574]: W1004 04:46:38.224683 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod649982aa_c9c5_41ce_a056_48ad058e9aa5.slice/crio-4f09f67492dc72b47ba2add823105deab6840d48ed1737b9464531d8297950ac WatchSource:0}: Error finding container 4f09f67492dc72b47ba2add823105deab6840d48ed1737b9464531d8297950ac: Status 404 returned error can't find the container with id 4f09f67492dc72b47ba2add823105deab6840d48ed1737b9464531d8297950ac Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.230748 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:46:38 crc kubenswrapper[4574]: W1004 04:46:38.255076 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75910bdc_1940_4d15_b390_4bcfcec9f72c.slice/crio-8386f9a39e12fe06a81d6349aa73c6ff4145fcb8c96e8b5f9394e8fe8d1b48f4 WatchSource:0}: Error finding container 8386f9a39e12fe06a81d6349aa73c6ff4145fcb8c96e8b5f9394e8fe8d1b48f4: Status 404 returned error can't find the container with id 8386f9a39e12fe06a81d6349aa73c6ff4145fcb8c96e8b5f9394e8fe8d1b48f4 Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.282277 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ntdng"] Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.282973 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.307328 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.307630 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.308168 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.309157 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.309289 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.309374 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.312513 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.337536 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354823 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-env-overrides\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354861 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znkbp\" (UniqueName: \"kubernetes.io/projected/e473790c-4fad-4637-9d72-0dd6310b4ae0-kube-api-access-znkbp\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354900 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354922 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-systemd\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354939 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-node-log\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354952 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-script-lib\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354966 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-systemd-units\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354978 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-ovn\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.354991 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-log-socket\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355005 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-etc-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355023 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-bin\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355038 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-var-lib-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355051 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355084 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-slash\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355099 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355117 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355133 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-config\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355148 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-netd\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355168 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-kubelet\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.355262 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-netns\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.391968 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.426452 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455705 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-systemd-units\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455739 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-ovn\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 
04:46:38.455756 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-etc-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455772 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-log-socket\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455787 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-bin\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455802 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-var-lib-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455815 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455842 4574 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-slash\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455857 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455870 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455888 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-config\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455901 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-kubelet\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455917 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-netd\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455935 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-netns\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455949 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-env-overrides\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455969 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znkbp\" (UniqueName: \"kubernetes.io/projected/e473790c-4fad-4637-9d72-0dd6310b4ae0-kube-api-access-znkbp\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.455986 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456000 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-systemd\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456015 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-node-log\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456028 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-script-lib\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456619 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456651 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456681 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-slash\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456701 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456702 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-bin\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456707 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-var-lib-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456781 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-netd\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456798 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-node-log\") pod \"ovnkube-node-ntdng\" (UID: 
\"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456836 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-ovn\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456856 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-netns\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456868 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-script-lib\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456883 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-log-socket\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456896 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-systemd\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: 
I1004 04:46:38.456923 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-systemd-units\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456943 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-etc-openvswitch\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.456961 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-kubelet\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.457201 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-env-overrides\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.457363 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-config\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.460970 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.462114 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.476498 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.479587 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znkbp\" (UniqueName: \"kubernetes.io/projected/e473790c-4fad-4637-9d72-0dd6310b4ae0-kube-api-access-znkbp\") pod \"ovnkube-node-ntdng\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.491614 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.503975 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.516753 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.528086 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.538299 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.549097 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.561603 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.572368 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 
04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.592176 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.644520 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:38 crc kubenswrapper[4574]: W1004 04:46:38.681956 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode473790c_4fad_4637_9d72_0dd6310b4ae0.slice/crio-440c0c35e5a6ea9dfecf337d50ad9e8c4f7c6fa3d2f43e20aff370d355af9bee WatchSource:0}: Error finding container 440c0c35e5a6ea9dfecf337d50ad9e8c4f7c6fa3d2f43e20aff370d355af9bee: Status 404 returned error can't find the container with id 440c0c35e5a6ea9dfecf337d50ad9e8c4f7c6fa3d2f43e20aff370d355af9bee Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.846185 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerStarted","Data":"5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.846257 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerStarted","Data":"a4e5aa99aec4b29a21974e7fa8ee8130ed3f5f9155b53e131250dd06a496ecb4"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.848080 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerStarted","Data":"c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.848116 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerStarted","Data":"4f09f67492dc72b47ba2add823105deab6840d48ed1737b9464531d8297950ac"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.851055 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.851088 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"8386f9a39e12fe06a81d6349aa73c6ff4145fcb8c96e8b5f9394e8fe8d1b48f4"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.852566 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d" exitCode=0 Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.852597 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" 
event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.852617 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"440c0c35e5a6ea9dfecf337d50ad9e8c4f7c6fa3d2f43e20aff370d355af9bee"} Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.862456 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.874436 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a3
76e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.888910 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.913001 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-
openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.928979 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.944433 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.957415 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.970576 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.985827 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:38 crc kubenswrapper[4574]: I1004 04:46:38.999320 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:38Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.011472 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.023454 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.046658 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ce
aa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.063534 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.082275 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.098967 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.108802 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.124086 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.137435 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.150602 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.165749 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.184057 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.196760 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] 
\\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543a
d2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.212884 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.223396 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.238588 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.257934 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.270979 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.495440 4574 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.497055 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.497209 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.497298 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.497446 4574 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.503920 4574 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.504204 4574 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.505569 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.505664 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.505743 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.505811 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.505873 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.523520 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.527295 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.527322 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.527331 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.527345 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.527354 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.538558 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.541438 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.541472 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.541481 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.541496 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.541505 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.551385 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.554071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.554099 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.554107 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.554119 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.554128 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.565484 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.568530 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.568560 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.568569 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.568583 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.568593 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.580301 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.580414 4574 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.582210 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.582332 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.582417 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.582517 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.582608 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.685158 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.685441 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.685541 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.685637 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.685714 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.732315 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.732315 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.732853 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.732736 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.732388 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:39 crc kubenswrapper[4574]: E1004 04:46:39.732933 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.787589 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.787624 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.787635 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.787651 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.787662 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.856724 4574 generic.go:334] "Generic (PLEG): container finished" podID="4d896311-2a08-4a70-b74e-2a9b10abc7ae" containerID="5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5" exitCode=0 Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.856792 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerDied","Data":"5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.859650 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.862647 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.862680 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.862694 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.862706 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.862717 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.862743 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.877717 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.889186 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.889640 4574 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.889663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.889671 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.889683 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.889693 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.898505 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.912483 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.930530 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.944553 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.958544 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.973853 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.985411 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.996156 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.996192 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.996217 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.996257 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.996271 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:39Z","lastTransitionTime":"2025-10-04T04:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:39 crc kubenswrapper[4574]: I1004 04:46:39.998777 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:39Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.011020 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.022317 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.036685 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.048394 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.063415 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.075784 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.086617 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.098209 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.098299 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.098405 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.098414 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.098428 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.098437 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.116713 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.131020 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.144419 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.155836 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.169830 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.187420 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.195128 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6mcbn"] Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.195576 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.197100 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.197260 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.198219 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.198587 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.200328 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 
04:46:40.200358 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.200367 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.200382 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.200391 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.202712 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.214162 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.225887 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.239962 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.256175 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.267987 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.271017 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2zq\" (UniqueName: \"kubernetes.io/projected/17493fd3-2995-469d-bd5a-5158f2866895-kube-api-access-7l2zq\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.271086 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17493fd3-2995-469d-bd5a-5158f2866895-serviceca\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.271122 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17493fd3-2995-469d-bd5a-5158f2866895-host\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.287396 
4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.302631 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.302861 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.302868 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.302883 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.302891 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.315723 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.352331 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.372283 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17493fd3-2995-469d-bd5a-5158f2866895-serviceca\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.372338 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/17493fd3-2995-469d-bd5a-5158f2866895-host\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.372371 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2zq\" (UniqueName: \"kubernetes.io/projected/17493fd3-2995-469d-bd5a-5158f2866895-kube-api-access-7l2zq\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.373081 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17493fd3-2995-469d-bd5a-5158f2866895-host\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.373610 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/17493fd3-2995-469d-bd5a-5158f2866895-serviceca\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.391280 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.405060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.405096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.405105 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.405119 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.405127 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.418423 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2zq\" (UniqueName: \"kubernetes.io/projected/17493fd3-2995-469d-bd5a-5158f2866895-kube-api-access-7l2zq\") pod \"node-ca-6mcbn\" (UID: \"17493fd3-2995-469d-bd5a-5158f2866895\") " pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.450996 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.491225 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.508057 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.508087 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.508096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc 
kubenswrapper[4574]: I1004 04:46:40.508112 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.508122 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.531732 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.572342 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.610649 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.610888 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.610985 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.611078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.611171 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.614423 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.654866 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.705332 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6mcbn" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.712736 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.712976 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.713091 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.713203 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.713310 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.724052 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.754950 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.782873 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.816302 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.816342 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.816352 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 
04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.816371 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.816382 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.866049 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6mcbn" event={"ID":"17493fd3-2995-469d-bd5a-5158f2866895","Type":"ContainerStarted","Data":"be3f5944cf89639eb26f96c3c3551d9a6bc4b83253d9d9edf86c2411fda722a8"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.867908 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerStarted","Data":"a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.882764 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.893257 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.910526 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.918817 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.918849 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.918859 
4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.918875 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.918885 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:40Z","lastTransitionTime":"2025-10-04T04:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.937206 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:40 crc kubenswrapper[4574]: I1004 04:46:40.972463 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.013355 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.022558 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.022655 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.022670 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.022696 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.022717 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.052553 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.089600 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.124701 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.124747 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.124761 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc 
kubenswrapper[4574]: I1004 04:46:41.124777 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.124805 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.136715 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.171436 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.208671 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.226648 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.226680 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.226689 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.226703 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.226712 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.252627 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z 
is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.281584 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.281790 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:46:49.281764873 +0000 UTC m=+35.135907945 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.296786 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.328808 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.328844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.328853 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.328870 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.328879 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.332747 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.370438 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.382892 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.382938 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.382958 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.382983 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383079 4574 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383118 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383144 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 
04:46:41.383148 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383157 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383101 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383195 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:49.383171987 +0000 UTC m=+35.237315089 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383203 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383213 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383215 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:49.383206438 +0000 UTC m=+35.237349580 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383285 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:49.38327094 +0000 UTC m=+35.237413982 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.383296 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:49.383290771 +0000 UTC m=+35.237433813 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.431260 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.431298 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.431315 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.431333 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.431342 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.534909 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.535271 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.535286 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.535302 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.535317 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.638209 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.638265 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.638278 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.638294 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.638306 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.732358 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.732429 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.732362 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.732505 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.732660 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:41 crc kubenswrapper[4574]: E1004 04:46:41.732720 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.740427 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.740460 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.740469 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.740485 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.740494 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.842378 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.842417 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.842430 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.842447 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.842460 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.873056 4574 generic.go:334] "Generic (PLEG): container finished" podID="4d896311-2a08-4a70-b74e-2a9b10abc7ae" containerID="a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475" exitCode=0 Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.873157 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerDied","Data":"a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.874474 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6mcbn" event={"ID":"17493fd3-2995-469d-bd5a-5158f2866895","Type":"ContainerStarted","Data":"0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.882667 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.890674 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.906781 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 
UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1
c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.921727 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.934085 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.944774 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.944813 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.944823 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.944840 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.944851 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:41Z","lastTransitionTime":"2025-10-04T04:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.952006 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.966186 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.978526 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:41 crc kubenswrapper[4574]: I1004 04:46:41.994814 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:41Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.014586 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.026940 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.037967 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.048604 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.048644 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.048655 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.048673 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.048685 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.052589 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.101708 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.115994 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.126455 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.137021 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.147997 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.150881 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.150909 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.150918 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc 
kubenswrapper[4574]: I1004 04:46:42.150932 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.150941 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.163786 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.176211 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.189184 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC 
(now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c241
0873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.211056 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.253147 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.253169 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.253121 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.253177 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.253260 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.253271 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.292052 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.336870 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.355719 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.355746 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.355754 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.355767 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.355776 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.371615 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.409040 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.459364 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.459510 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.459524 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.459531 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.459543 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.459552 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.488631 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.533104 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.561925 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc 
kubenswrapper[4574]: I1004 04:46:42.561957 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.561965 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.561981 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.561990 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.575815 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.664433 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.664464 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.664473 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.664487 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.664496 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.766307 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.766349 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.766359 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.766374 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.766384 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.868605 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.868642 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.868652 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.868668 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.868679 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.887047 4574 generic.go:334] "Generic (PLEG): container finished" podID="4d896311-2a08-4a70-b74e-2a9b10abc7ae" containerID="612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12" exitCode=0 Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.887117 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerDied","Data":"612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.907166 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.917408 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.931015 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 
04:46:42.948028 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.959930 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.971072 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.971135 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.971145 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 
04:46:42.971160 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.971168 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:42Z","lastTransitionTime":"2025-10-04T04:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.972067 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.983711 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:42 crc kubenswrapper[4574]: I1004 04:46:42.994023 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:42Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.009447 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC 
(now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c241
0873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.021499 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.032014 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.054564 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.072666 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.072700 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.072711 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.072726 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.072738 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.098580 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.131008 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.169227 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.174751 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.174788 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.174800 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.174818 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.174829 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.277099 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.277138 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.277149 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.277164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.277173 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.379969 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.380013 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.380030 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.380054 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.380066 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.482104 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.482151 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.482163 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.482178 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.482187 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.584731 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.584765 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.584774 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.584790 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.584810 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.687324 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.687371 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.687386 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.687404 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.687415 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.733192 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.733244 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:43 crc kubenswrapper[4574]: E1004 04:46:43.733680 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.733295 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:43 crc kubenswrapper[4574]: E1004 04:46:43.733765 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:43 crc kubenswrapper[4574]: E1004 04:46:43.733805 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.789084 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.789124 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.789132 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.789146 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.789173 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.890760 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.890796 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.890806 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.890822 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.890831 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.894783 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.895036 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.895064 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.901775 4574 generic.go:334] "Generic (PLEG): container finished" podID="4d896311-2a08-4a70-b74e-2a9b10abc7ae" containerID="801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4" exitCode=0 Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.901964 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerDied","Data":"801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4"} Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.912116 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.922799 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.934056 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.934251 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.936674 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 
tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.948635 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.960615 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.979154 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],
\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.992617 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:43Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.993321 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.993355 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.993365 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.993380 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:43 crc kubenswrapper[4574]: I1004 04:46:43.993389 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:43Z","lastTransitionTime":"2025-10-04T04:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.005518 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.020457 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.038566 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.050694 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.062749 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.075640 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.086708 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.095355 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.095382 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.095390 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc 
kubenswrapper[4574]: I1004 04:46:44.095409 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.095418 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.098705 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.109101 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.121299 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.132496 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.145087 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.157968 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] 
\\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543a
d2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.170554 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.182214 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.195386 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.197092 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.197129 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.197139 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.197157 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.197166 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.212856 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.224938 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.236664 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.255492 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.294751 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
4T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.299652 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.299697 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.299707 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.299728 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.299740 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.331377 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.372936 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259
d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.401506 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.401543 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.401552 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.401566 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.401575 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.504015 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.504071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.504082 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.504104 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.504115 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.606385 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.606438 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.606450 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.606469 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.606479 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.708729 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.708761 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.708769 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.708782 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.708792 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.757318 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.769431 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.778014 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.801550 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.810663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.810695 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.810706 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.810724 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.810735 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.813525 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.824685 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.838647 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.850002 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.860691 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.870650 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.881163 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.892446 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] 
\\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543a
d2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.902522 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.906630 4574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.907265 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerStarted","Data":"1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.912653 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.912674 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.912705 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.912723 4574 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.912734 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:44Z","lastTransitionTime":"2025-10-04T04:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.929338 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:44 crc kubenswrapper[4574]: I1004 04:46:44.971825 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:44Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.014854 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc 
kubenswrapper[4574]: I1004 04:46:45.014893 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.014903 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.014919 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.014929 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.116774 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.117048 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.117057 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.117070 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.117079 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.219083 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.219147 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.219155 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.219182 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.219199 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.321599 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.321646 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.321657 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.321673 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.321684 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.423389 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.423428 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.423438 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.423454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.423468 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.525848 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.525885 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.525894 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.525910 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.525919 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.628270 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.628347 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.628359 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.628375 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.628386 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.730432 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.730467 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.730478 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.730508 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.730522 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.732811 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.732894 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:45 crc kubenswrapper[4574]: E1004 04:46:45.732913 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.732811 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:45 crc kubenswrapper[4574]: E1004 04:46:45.733027 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:45 crc kubenswrapper[4574]: E1004 04:46:45.733071 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.832600 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.832644 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.832655 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.832670 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.832680 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.911913 4574 generic.go:334] "Generic (PLEG): container finished" podID="4d896311-2a08-4a70-b74e-2a9b10abc7ae" containerID="1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee" exitCode=0 Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.912050 4574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.912034 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerDied","Data":"1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.934692 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e1890992126
2e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:45Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.935500 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.935529 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.935538 4574 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.935552 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.935561 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:45Z","lastTransitionTime":"2025-10-04T04:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.949796 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-04T04:46:45Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.960806 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:45Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.974940 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:45Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:45 crc kubenswrapper[4574]: I1004 04:46:45.984927 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:45Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.000053 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259
d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:45Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.017902 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.031269 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.037515 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.037546 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.037557 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.037572 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.037582 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.040871 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.053155 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.064188 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.077861 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC 
(now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c241
0873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.089504 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.100949 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.112867 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.139483 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.139518 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.139527 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.139543 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.139595 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.210936 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.230784 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e1
8909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.243103 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.243137 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.243147 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.243164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.243176 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.244353 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.256210 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.275151 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.288449 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.309881 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.334803 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.346164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.346215 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.346227 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.346261 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.346275 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.349602 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.363167 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.376943 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.391514 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.404491 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.417838 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.431335 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.443442 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.449381 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.449412 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.449419 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.449434 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.449444 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.552631 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.552723 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.552747 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.552783 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.552808 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.655664 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.655708 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.655717 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.655732 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.655742 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.757619 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.757649 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.757659 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.757673 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.757681 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.860026 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.860071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.860080 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.860095 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.860107 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.918114 4574 generic.go:334] "Generic (PLEG): container finished" podID="4d896311-2a08-4a70-b74e-2a9b10abc7ae" containerID="b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c" exitCode=0 Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.918161 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerDied","Data":"b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.942558 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.955094 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.962086 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.962150 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.962165 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.962183 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.962192 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:46Z","lastTransitionTime":"2025-10-04T04:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.965120 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.979039 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:46 crc kubenswrapper[4574]: I1004 04:46:46.989974 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:46Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.005548 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259
d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.023806 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.035514 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.046289 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.058046 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.064438 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.064478 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.064489 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.064514 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.064526 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.067637 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.082536 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC 
(now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410
873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.094975 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.112202 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.128111 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.166359 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.166406 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.166420 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.166436 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.166450 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.269030 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.269347 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.269358 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.269373 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.269385 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.371415 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.371556 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.371568 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.371585 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.371597 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.473817 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.473877 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.473888 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.473906 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.473917 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.575729 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.575762 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.575772 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.575788 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.575801 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.677819 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.677847 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.677856 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.677872 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.677881 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.732136 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.732199 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:47 crc kubenswrapper[4574]: E1004 04:46:47.732269 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.732139 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:47 crc kubenswrapper[4574]: E1004 04:46:47.732337 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:47 crc kubenswrapper[4574]: E1004 04:46:47.732472 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.784051 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.784107 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.784117 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.784132 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.784145 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.886618 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.886656 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.886664 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.886677 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.886685 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.923111 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/0.log" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.925761 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151" exitCode=1 Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.925821 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.926514 4574 scope.go:117] "RemoveContainer" containerID="346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.929281 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" event={"ID":"4d896311-2a08-4a70-b74e-2a9b10abc7ae","Type":"ContainerStarted","Data":"1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.939001 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.955650 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.966697 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.978307 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.988967 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.989011 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.989020 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:47 crc 
kubenswrapper[4574]: I1004 04:46:47.989035 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.989044 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:47Z","lastTransitionTime":"2025-10-04T04:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:47 crc kubenswrapper[4574]: I1004 04:46:47.990449 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04
:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:47Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.003784 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.015216 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.026932 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.048698 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ce
aa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.064589 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.073655 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.086219 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.091368 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.091393 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.091404 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.091423 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.091433 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.099780 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.118632 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.137226 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1004 04:46:47.309646 5728 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309684 5728 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309799 5728 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310063 5728 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310347 5728 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310539 5728 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310577 5728 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 04:46:47.310834 5728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1004 04:46:47.310883 5728 factory.go:656] Stopping watch factory\\\\nI1004 04:46:47.310920 5728 ovnkube.go:599] Stopped ovnkube\\\\nI1004 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.154690 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.166855 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.176799 4574 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.190352 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\
\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.194255 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.194290 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.194299 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.194313 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.194322 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.206882 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1004 04:46:47.309646 5728 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309684 5728 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309799 5728 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310063 5728 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310347 5728 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310539 5728 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310577 5728 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 04:46:47.310834 5728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1004 04:46:47.310883 5728 factory.go:656] Stopping watch factory\\\\nI1004 04:46:47.310920 5728 ovnkube.go:599] Stopped ovnkube\\\\nI1004 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.219861 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.229391 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.241720 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.252731 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.265368 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.279097 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.296029 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.296068 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.296078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc 
kubenswrapper[4574]: I1004 04:46:48.296093 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.296104 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.300893 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.315448 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.329468 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.342763 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.398862 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.398904 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.398914 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.398930 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.398941 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.501206 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.501414 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.501500 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.501571 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.501635 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.603482 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.603532 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.603599 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.603621 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.603632 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.705108 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.705135 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.705144 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.705164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.705178 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.807214 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.807265 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.807273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.807287 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.807298 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.909121 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.909164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.909173 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.909188 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.909197 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:48Z","lastTransitionTime":"2025-10-04T04:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.932977 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/1.log" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.933495 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/0.log" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.935608 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33" exitCode=1 Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.935636 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33"} Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.935703 4574 scope.go:117] "RemoveContainer" containerID="346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.936470 4574 scope.go:117] "RemoveContainer" containerID="76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33" Oct 04 04:46:48 crc kubenswrapper[4574]: E1004 04:46:48.937649 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.958069 4574 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe
91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e17
8100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.970579 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 
04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.980628 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:48 crc kubenswrapper[4574]: I1004 04:46:48.994373 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d
95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:48Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.004429 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.010823 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.010866 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.010878 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.010894 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.010905 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.017828 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.036736 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1004 04:46:47.309646 5728 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309684 5728 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309799 5728 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310063 5728 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310347 5728 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310539 5728 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310577 5728 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 04:46:47.310834 5728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1004 04:46:47.310883 5728 factory.go:656] Stopping watch factory\\\\nI1004 04:46:47.310920 5728 ovnkube.go:599] Stopped ovnkube\\\\nI1004 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.048211 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.059622 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.071574 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.083155 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.095853 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.108550 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.112970 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.113000 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.113009 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.113022 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.113032 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.119318 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.129468 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"20
25-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.215054 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.215097 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.215105 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.215121 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.215130 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.317718 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.317750 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.317759 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.317772 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.317782 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.358959 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.359113 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:47:05.359083231 +0000 UTC m=+51.213226273 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.419953 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.419989 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.419998 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.420013 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.420022 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.460526 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.460574 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.460614 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.460641 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460669 4574 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460729 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460743 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460752 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460804 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460852 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460860 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460867 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460734 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:05.460714651 +0000 UTC m=+51.314857693 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460893 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:05.460885556 +0000 UTC m=+51.315028588 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460904 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:05.460897987 +0000 UTC m=+51.315041029 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.460913 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:05.460909627 +0000 UTC m=+51.315052669 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.522613 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.522654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.522663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.522679 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.522688 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.625744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.625785 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.625794 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.625809 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.625820 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.728298 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.728335 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.728345 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.728360 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.728371 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.732777 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.732787 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.732880 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.733005 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.733093 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.733184 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.829986 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.830016 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.830023 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.830037 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.830046 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.931915 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.931957 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.931968 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.931983 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.931992 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.940187 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/1.log" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.967393 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.967442 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.967452 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.967472 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.967483 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.979939 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.983119 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.983153 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.983164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.983181 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.983192 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:49 crc kubenswrapper[4574]: E1004 04:46:49.995848 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:49Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.999293 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.999326 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.999340 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.999361 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:49 crc kubenswrapper[4574]: I1004 04:46:49.999372 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:49Z","lastTransitionTime":"2025-10-04T04:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: E1004 04:46:50.010896 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.014375 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.014417 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.014430 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.014447 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.014458 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: E1004 04:46:50.024926 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.027542 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.027576 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.027587 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.027603 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.027614 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: E1004 04:46:50.039807 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: E1004 04:46:50.039921 4574 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.041253 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.041310 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.041326 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.041343 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.041357 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.143928 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.143963 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.143974 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.143991 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.144003 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.245993 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.246028 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.246039 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.246057 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.246067 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.347919 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.347976 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.347986 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.348003 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.348013 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.450723 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.450764 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.450782 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.450803 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.450814 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.497951 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn"] Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.498433 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.500126 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.500282 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.519135 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.531826 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.540527 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.552016 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.553611 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.553661 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.553679 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.553701 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.553717 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.565406 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.572427 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m82q2\" (UniqueName: \"kubernetes.io/projected/458beb2c-7930-4fed-87c1-97ef6193e7ca-kube-api-access-m82q2\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.572489 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/458beb2c-7930-4fed-87c1-97ef6193e7ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.572517 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/458beb2c-7930-4fed-87c1-97ef6193e7ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.572539 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/458beb2c-7930-4fed-87c1-97ef6193e7ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.577502 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:3
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.592292 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80126
8ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.609957 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://346ea8078e6d53bfbf0b21bb986c974958c91cd8bdeef1227c7a3861fc3b3151\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"message\\\":\\\"workPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1004 04:46:47.309646 5728 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309684 5728 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.309799 5728 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310063 5728 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310347 5728 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310539 5728 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1004 04:46:47.310577 5728 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1004 04:46:47.310834 5728 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1004 04:46:47.310883 5728 factory.go:656] Stopping watch factory\\\\nI1004 04:46:47.310920 5728 ovnkube.go:599] Stopped ovnkube\\\\nI1004 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.617076 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.617827 4574 scope.go:117] "RemoveContainer" containerID="76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33" Oct 04 04:46:50 crc kubenswrapper[4574]: E1004 04:46:50.617980 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.621944 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.632831 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.644557 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.656533 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.657013 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.657155 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.657269 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.657351 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.659858 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.673814 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m82q2\" (UniqueName: \"kubernetes.io/projected/458beb2c-7930-4fed-87c1-97ef6193e7ca-kube-api-access-m82q2\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.673874 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/458beb2c-7930-4fed-87c1-97ef6193e7ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.673902 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/458beb2c-7930-4fed-87c1-97ef6193e7ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.673920 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/458beb2c-7930-4fed-87c1-97ef6193e7ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.674945 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/458beb2c-7930-4fed-87c1-97ef6193e7ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.675152 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/458beb2c-7930-4fed-87c1-97ef6193e7ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.675270 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.681813 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/458beb2c-7930-4fed-87c1-97ef6193e7ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.689587 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.697444 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m82q2\" (UniqueName: \"kubernetes.io/projected/458beb2c-7930-4fed-87c1-97ef6193e7ca-kube-api-access-m82q2\") pod \"ovnkube-control-plane-749d76644c-gs8xn\" (UID: \"458beb2c-7930-4fed-87c1-97ef6193e7ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.705869 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.717940 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.728416 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.739414 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.749938 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.760413 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.760454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.760466 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.760484 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.760513 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.761818 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 
04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.773436 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.782905 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.795295 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.811431 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.814700 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[
{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: W1004 04:46:50.828281 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458beb2c_7930_4fed_87c1_97ef6193e7ca.slice/crio-80c4e80e92ebf5cfe830e62347370f617ab5f512fb92025fcf2373d81e0943ff WatchSource:0}: Error finding container 80c4e80e92ebf5cfe830e62347370f617ab5f512fb92025fcf2373d81e0943ff: Status 404 returned error can't find the container with id 80c4e80e92ebf5cfe830e62347370f617ab5f512fb92025fcf2373d81e0943ff Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.842863 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.853816 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.864617 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.864652 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.864665 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.864682 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.864699 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.869551 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.892143 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.905242 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.916721 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.927406 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.938003 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:50Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.947047 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" event={"ID":"458beb2c-7930-4fed-87c1-97ef6193e7ca","Type":"ContainerStarted","Data":"80c4e80e92ebf5cfe830e62347370f617ab5f512fb92025fcf2373d81e0943ff"} Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.966591 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:50 crc 
kubenswrapper[4574]: I1004 04:46:50.966625 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.966637 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.966656 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:50 crc kubenswrapper[4574]: I1004 04:46:50.966670 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:50Z","lastTransitionTime":"2025-10-04T04:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.068266 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.068311 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.068322 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.068371 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.068394 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.171208 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.171269 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.171279 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.171294 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.171304 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.273845 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.273881 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.273891 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.273905 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.273915 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.375573 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.375619 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.375633 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.375652 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.375666 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.477924 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.477952 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.477963 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.477979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.477987 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.576810 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-stmq5"] Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.577251 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:51 crc kubenswrapper[4574]: E1004 04:46:51.577311 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.580060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.580090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.580098 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.580111 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.580120 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.590583 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 
04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.601893 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.612587 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.623699 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.635095 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc 
kubenswrapper[4574]: I1004 04:46:51.658617 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.670667 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.679730 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.681053 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsnkr\" (UniqueName: \"kubernetes.io/projected/833018b5-b584-4e77-a95f-fe56f6dd5945-kube-api-access-tsnkr\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.681094 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.682133 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.682159 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.682167 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.682181 4574 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.682189 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.690546 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.708293 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.720108 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.729052 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.732286 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:51 crc kubenswrapper[4574]: E1004 04:46:51.732362 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.732573 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:51 crc kubenswrapper[4574]: E1004 04:46:51.732622 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.732570 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:51 crc kubenswrapper[4574]: E1004 04:46:51.732684 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.743064 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80126
8ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.752213 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: 
I1004 04:46:51.762268 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.772844 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.782063 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsnkr\" (UniqueName: \"kubernetes.io/projected/833018b5-b584-4e77-a95f-fe56f6dd5945-kube-api-access-tsnkr\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.782120 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:51 crc kubenswrapper[4574]: E1004 04:46:51.782320 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:51 crc kubenswrapper[4574]: E1004 04:46:51.782373 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:52.282358897 +0000 UTC m=+38.136501939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.783158 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.783654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 
04:46:51.783680 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.783688 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.783704 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.783714 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.797882 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsnkr\" (UniqueName: \"kubernetes.io/projected/833018b5-b584-4e77-a95f-fe56f6dd5945-kube-api-access-tsnkr\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.885500 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.885532 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.885540 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.885554 4574 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.885562 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.951389 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" event={"ID":"458beb2c-7930-4fed-87c1-97ef6193e7ca","Type":"ContainerStarted","Data":"ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.951428 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" event={"ID":"458beb2c-7930-4fed-87c1-97ef6193e7ca","Type":"ContainerStarted","Data":"cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.963359 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.974638 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:
33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.986128 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.988166 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.988227 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.988254 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.988273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.988283 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:51Z","lastTransitionTime":"2025-10-04T04:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:51 crc kubenswrapper[4574]: I1004 04:46:51.999084 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:51Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.038312 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.049567 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc 
kubenswrapper[4574]: I1004 04:46:52.068266 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.081185 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.090819 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.090850 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.090858 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.090874 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.090884 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.091105 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.104208 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80126
8ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.119902 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.132271 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.146177 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.158952 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.174128 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.186294 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.192836 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.192878 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.192889 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.192906 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.192917 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.198924 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:52Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.286623 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:52 crc kubenswrapper[4574]: E1004 04:46:52.286803 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:52 crc kubenswrapper[4574]: E1004 04:46:52.287012 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:53.286996186 +0000 UTC m=+39.141139228 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.294662 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.294692 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.294701 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.294716 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.294726 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.397966 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.398280 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.398356 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.398431 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.398501 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.500831 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.500863 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.500873 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.500890 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.500901 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.603131 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.603160 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.603170 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.603186 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.603196 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.704995 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.705046 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.705057 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.705073 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.705083 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.807455 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.807507 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.807517 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.807534 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.807545 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.909756 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.909808 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.909819 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.909839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:52 crc kubenswrapper[4574]: I1004 04:46:52.909850 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:52Z","lastTransitionTime":"2025-10-04T04:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.012550 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.012597 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.012606 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.012622 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.012632 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.114305 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.114341 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.114350 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.114368 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.114383 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.216716 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.216747 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.216756 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.216771 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.216779 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.297977 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:53 crc kubenswrapper[4574]: E1004 04:46:53.298125 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:53 crc kubenswrapper[4574]: E1004 04:46:53.298169 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:55.298156019 +0000 UTC m=+41.152299061 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.318600 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.318633 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.318644 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.318658 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.318667 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.421187 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.421247 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.421257 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.421276 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.421286 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.523916 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.523955 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.523992 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.524007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.524016 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.626068 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.626109 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.626118 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.626131 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.626139 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.728387 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.728426 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.728439 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.728456 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.728467 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.732806 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.732836 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.732884 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:53 crc kubenswrapper[4574]: E1004 04:46:53.732943 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.732810 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:53 crc kubenswrapper[4574]: E1004 04:46:53.733024 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:46:53 crc kubenswrapper[4574]: E1004 04:46:53.733069 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:53 crc kubenswrapper[4574]: E1004 04:46:53.733105 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.830912 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.830976 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.831002 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.831054 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.831087 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.933792 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.933835 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.933845 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.933860 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:53 crc kubenswrapper[4574]: I1004 04:46:53.933868 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:53Z","lastTransitionTime":"2025-10-04T04:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.036380 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.036419 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.036430 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.036448 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.036460 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.139052 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.139100 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.139111 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.139137 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.139149 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.241573 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.241605 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.241616 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.241631 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.241640 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.343581 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.343613 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.343621 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.343635 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.343644 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.445707 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.445754 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.445764 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.445782 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.445794 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.548380 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.548427 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.548440 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.548458 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.548471 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.650915 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.650951 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.650961 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.650975 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.650984 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.745806 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.753421 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.753466 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.753479 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.753496 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.753508 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.756935 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.768309 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a
8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.777439 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc 
kubenswrapper[4574]: I1004 04:46:54.798753 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.810796 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.821700 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.836267 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29
acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.854425 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.856096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.856144 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.856156 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.856170 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.856181 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.872150 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.885012 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.897652 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.908921 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.920325 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.931437 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.943559 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.957707 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 
2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad
2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:46:54Z is after 2025-08-24T17:21:41Z" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.960260 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.960412 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.960499 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.960578 4574 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:54 crc kubenswrapper[4574]: I1004 04:46:54.960663 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:54Z","lastTransitionTime":"2025-10-04T04:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.062663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.062698 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.062710 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.062726 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.062736 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.165024 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.165067 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.165079 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.165095 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.165106 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.267512 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.267568 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.267579 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.267597 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.267608 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.316978 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:55 crc kubenswrapper[4574]: E1004 04:46:55.317180 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:55 crc kubenswrapper[4574]: E1004 04:46:55.317285 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:46:59.317229531 +0000 UTC m=+45.171372563 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.370334 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.370605 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.370708 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.370799 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.370877 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.473405 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.473654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.473738 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.473830 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.473899 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.575902 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.575935 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.575944 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.575959 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.575968 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.678396 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.678428 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.678438 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.678454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.678466 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.732738 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:55 crc kubenswrapper[4574]: E1004 04:46:55.732897 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.732769 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:55 crc kubenswrapper[4574]: E1004 04:46:55.732988 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.732775 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:55 crc kubenswrapper[4574]: E1004 04:46:55.733071 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.732769 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:55 crc kubenswrapper[4574]: E1004 04:46:55.733161 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.780437 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.780468 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.780479 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.780496 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.780506 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.882350 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.882639 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.882785 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.882863 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.882933 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.985412 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.985446 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.985455 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.985468 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:55 crc kubenswrapper[4574]: I1004 04:46:55.985822 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:55Z","lastTransitionTime":"2025-10-04T04:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.087923 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.088477 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.088592 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.088700 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.088809 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.190788 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.191121 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.191261 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.191361 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.191449 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.293373 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.293410 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.293421 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.293437 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.293448 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.395581 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.395742 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.395757 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.395774 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.395785 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.497592 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.497846 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.497911 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.497997 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.498087 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.600534 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.600770 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.600837 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.600934 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.601023 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.703023 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.703092 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.703109 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.703558 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.703620 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.806019 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.806077 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.806086 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.806102 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.806113 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.907974 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.908013 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.908025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.908041 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:56 crc kubenswrapper[4574]: I1004 04:46:56.908053 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:56Z","lastTransitionTime":"2025-10-04T04:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.009735 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.009776 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.009787 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.009805 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.009816 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.113272 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.113327 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.113340 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.113362 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.113375 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.216400 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.216443 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.216454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.216473 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.216484 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.319040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.319595 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.319696 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.319805 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.319883 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.421571 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.421625 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.421638 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.421657 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.421669 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.523754 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.523807 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.523822 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.523839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.523855 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.626097 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.626129 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.626145 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.626162 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.626172 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.728046 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.728090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.728102 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.728118 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.728130 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.732368 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.732397 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.732417 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.732421 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:57 crc kubenswrapper[4574]: E1004 04:46:57.732472 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:57 crc kubenswrapper[4574]: E1004 04:46:57.732560 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:46:57 crc kubenswrapper[4574]: E1004 04:46:57.732634 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:57 crc kubenswrapper[4574]: E1004 04:46:57.732746 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.830361 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.830398 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.830407 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.830422 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.830431 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.932448 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.932491 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.932503 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.932517 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:57 crc kubenswrapper[4574]: I1004 04:46:57.932529 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:57Z","lastTransitionTime":"2025-10-04T04:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.034192 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.034249 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.034265 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.034281 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.034290 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.135997 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.136041 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.136052 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.136068 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.136081 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.238196 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.238269 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.238281 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.238299 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.238311 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.340083 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.340125 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.340134 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.340150 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.340164 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.442630 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.442680 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.442693 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.442710 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.442721 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.544961 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.544996 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.545006 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.545021 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.545030 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.647851 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.647898 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.647912 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.647929 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.647940 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.751066 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.751342 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.751351 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.751365 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.751374 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.854649 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.854687 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.854696 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.854709 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.854718 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.956838 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.956881 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.956894 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.956920 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:58 crc kubenswrapper[4574]: I1004 04:46:58.956930 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:58Z","lastTransitionTime":"2025-10-04T04:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.058948 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.058985 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.058993 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.059007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.059016 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.161471 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.161518 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.161530 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.161548 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.161561 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.263140 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.263187 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.263198 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.263216 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.263246 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.357115 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:59 crc kubenswrapper[4574]: E1004 04:46:59.357322 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:59 crc kubenswrapper[4574]: E1004 04:46:59.357427 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:07.357405705 +0000 UTC m=+53.211548757 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.365461 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.365505 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.365518 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.365534 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.365548 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.468454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.468542 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.468575 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.468597 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.468613 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.581610 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.581659 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.581677 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.581697 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.581709 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.683929 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.683970 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.683982 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.683999 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.684010 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.732336 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:46:59 crc kubenswrapper[4574]: E1004 04:46:59.732461 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.732336 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:46:59 crc kubenswrapper[4574]: E1004 04:46:59.732542 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.732344 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.732356 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:46:59 crc kubenswrapper[4574]: E1004 04:46:59.732590 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:46:59 crc kubenswrapper[4574]: E1004 04:46:59.732678 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.787071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.787107 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.787118 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.787133 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.787190 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.889602 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.889636 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.889649 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.889667 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.889683 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.991525 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.991562 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.991570 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.991586 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:46:59 crc kubenswrapper[4574]: I1004 04:46:59.991595 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:46:59Z","lastTransitionTime":"2025-10-04T04:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.094621 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.094664 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.094675 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.094695 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.094711 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.167461 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.167509 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.167521 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.167538 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.167551 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: E1004 04:47:00.177875 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:00Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.181097 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.181136 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.181149 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.181167 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.181178 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: E1004 04:47:00.194838 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:00Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.198813 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.198854 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.198867 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.198884 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.198895 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: E1004 04:47:00.209703 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:00Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.212351 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.212371 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.212379 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.212393 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.212402 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: E1004 04:47:00.222929 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:00Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.226482 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.226513 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.226523 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.226539 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.226550 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: E1004 04:47:00.237025 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:00Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:00 crc kubenswrapper[4574]: E1004 04:47:00.237142 4574 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.238420 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.238451 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.238462 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.238509 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.238521 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.340255 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.340302 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.340334 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.340353 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.340364 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.442573 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.442643 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.442667 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.442697 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.442718 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.546046 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.546114 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.546150 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.546183 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.546204 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.648789 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.648839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.648850 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.648876 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.648885 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.750981 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.751017 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.751025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.751046 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.751054 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.853487 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.853545 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.853561 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.853585 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.853604 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.956139 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.956218 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.956278 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.956311 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:00 crc kubenswrapper[4574]: I1004 04:47:00.956336 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:00Z","lastTransitionTime":"2025-10-04T04:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.058979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.059019 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.059026 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.059045 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.059056 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.161539 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.161575 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.161583 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.161597 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.161605 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.264329 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.264372 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.264380 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.264400 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.264409 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.366732 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.366774 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.366784 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.366803 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.366816 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.469330 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.469366 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.469374 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.469389 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.469398 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.571584 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.571647 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.571657 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.571671 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.571680 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.674051 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.674083 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.674095 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.674113 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.674124 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.732872 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.732901 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.732904 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.732948 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:01 crc kubenswrapper[4574]: E1004 04:47:01.733009 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:01 crc kubenswrapper[4574]: E1004 04:47:01.733106 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:01 crc kubenswrapper[4574]: E1004 04:47:01.733195 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:01 crc kubenswrapper[4574]: E1004 04:47:01.733287 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.776981 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.777019 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.777223 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.777268 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.777278 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.879559 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.879590 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.879598 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.879611 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.879620 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.981098 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.981158 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.981174 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.981191 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:01 crc kubenswrapper[4574]: I1004 04:47:01.981202 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:01Z","lastTransitionTime":"2025-10-04T04:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.083553 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.083622 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.083657 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.083674 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.083687 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.185509 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.185550 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.185558 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.185571 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.185582 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.287329 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.287456 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.287474 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.287494 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.287504 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.390334 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.390378 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.390388 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.390401 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.390411 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.493307 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.493358 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.493374 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.493400 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.493416 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.595958 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.596017 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.596031 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.596046 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.596055 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.699503 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.699577 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.700169 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.700206 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.700262 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.802348 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.802377 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.802388 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.802402 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.802414 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.904790 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.904828 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.904836 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.904853 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:02 crc kubenswrapper[4574]: I1004 04:47:02.904869 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:02Z","lastTransitionTime":"2025-10-04T04:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.007334 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.007369 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.007382 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.007397 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.007409 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.109784 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.109821 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.109831 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.109848 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.109860 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.212750 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.212790 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.212802 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.212818 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.212828 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.315508 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.315553 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.315562 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.315577 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.315586 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.419675 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.419731 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.419744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.419764 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.419778 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.522614 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.522665 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.522676 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.522693 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.522708 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.625505 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.626387 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.626426 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.626445 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.626458 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.729276 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.729322 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.729354 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.729373 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.729388 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.732445 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:03 crc kubenswrapper[4574]: E1004 04:47:03.732619 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.732490 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:03 crc kubenswrapper[4574]: E1004 04:47:03.732833 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.732461 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:03 crc kubenswrapper[4574]: E1004 04:47:03.733049 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.732493 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:03 crc kubenswrapper[4574]: E1004 04:47:03.733321 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.831818 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.832348 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.832440 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.832520 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.832609 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.935586 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.936079 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.936187 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.936312 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:03 crc kubenswrapper[4574]: I1004 04:47:03.936420 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:03Z","lastTransitionTime":"2025-10-04T04:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.039669 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.039739 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.039761 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.039791 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.039814 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.142489 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.142763 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.142832 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.142915 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.142990 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.245660 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.245742 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.245761 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.245796 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.245819 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.349024 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.349095 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.349117 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.349146 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.349166 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.452389 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.452445 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.452460 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.452478 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.452492 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.555658 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.555905 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.555982 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.556049 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.556104 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.658378 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.658422 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.658431 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.658447 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.658462 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.733445 4574 scope.go:117] "RemoveContainer" containerID="76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.748681 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.762819 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.762854 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.762863 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.762876 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.762888 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.767450 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.783398 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.802402 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.814383 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.826338 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.837409 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.850957 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.865082 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.868273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.868306 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.868318 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.868334 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.868345 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.878122 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.888435 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.902300 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.919831 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.930051 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.937845 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.947673 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.957045 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:04 crc 
kubenswrapper[4574]: I1004 04:47:04.970730 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.970793 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.970803 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.970817 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.970828 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:04Z","lastTransitionTime":"2025-10-04T04:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.988308 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/1.log" Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.990072 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954"} Oct 04 04:47:04 crc kubenswrapper[4574]: I1004 04:47:04.991074 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.000364 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a
72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:04Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.011750 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80126
8ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.026482 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.037348 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.047097 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.056864 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.066802 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.073043 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.073066 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.073074 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc 
kubenswrapper[4574]: I1004 04:47:05.073087 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.073095 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.078254 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.088808 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.100973 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.112935 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 
2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad
2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.127870 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.144133 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.158421 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.170701 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc 
kubenswrapper[4574]: I1004 04:47:05.174529 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.174569 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.174579 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.174595 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.174605 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.191730 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.209272 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:05Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.276835 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.276875 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.276885 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.276898 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.276907 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.378883 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.379808 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.379881 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.379950 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.380026 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.419194 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.419624 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:47:37.419608637 +0000 UTC m=+83.273751679 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.482761 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.482993 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.483187 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.483319 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: 
I1004 04:47:05.483490 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.520082 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.520117 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.520139 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.520169 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520293 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520310 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520338 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520349 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520315 4574 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520349 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:37.520333751 +0000 UTC m=+83.374476793 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520397 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:37.520388832 +0000 UTC m=+83.374531874 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520407 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:37.520401673 +0000 UTC m=+83.374544715 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520849 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.520955 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.521040 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.521149 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:37.521134724 +0000 UTC m=+83.375277766 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.586175 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.586557 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.586653 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.586740 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.586835 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.689788 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.690056 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.690261 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.690396 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.690491 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.732755 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.732775 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.732881 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.732783 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.732775 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.732975 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.733048 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.733108 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.792371 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.792412 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.792425 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.792445 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.792457 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.894569 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.894839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.894920 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.895024 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.895099 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.994613 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/2.log" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.995305 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/1.log" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.996624 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.996654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.996666 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.996681 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.996693 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:05Z","lastTransitionTime":"2025-10-04T04:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.998314 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954" exitCode=1 Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.998343 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954"} Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.998389 4574 scope.go:117] "RemoveContainer" containerID="76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33" Oct 04 04:47:05 crc kubenswrapper[4574]: I1004 04:47:05.999208 4574 scope.go:117] "RemoveContainer" containerID="feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954" Oct 04 04:47:05 crc kubenswrapper[4574]: E1004 04:47:05.999416 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.018673 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.029933 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.039422 4574 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.049316 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.063213 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc 
kubenswrapper[4574]: I1004 04:47:06.077345 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.085981 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.098787 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.098827 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.098837 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.098853 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.098864 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.103933 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\
\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4
a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.121394 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76bc4ea6bba58be7d203c400467e771064c43c5f3d859654fbac4e1dbad67e33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:46:48Z\\\",\\\"message\\\":\\\"13 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-wl5xt\\\\nI1004 04:46:48.708018 5913 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1004 04:46:48.707452 5913 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"f9232b32-e89f-4c8e-acc4-c6801b70dcb0\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/package-server-manager-metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, Add\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb
37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.132648 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.141944 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.152455 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.162835 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.173896 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.183074 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.192837 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:06Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.201391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.201434 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.201445 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.201464 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.201478 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.205834 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:06Z 
is after 2025-08-24T17:21:41Z" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.303101 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.303163 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.303181 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.303204 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.303220 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.405375 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.406672 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.406850 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.407057 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.407298 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.513154 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.513198 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.513209 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.513225 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.513259 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.615929 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.615977 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.615986 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.616025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.616037 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.717786 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.717817 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.717844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.717859 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.717867 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.821219 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.821313 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.821362 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.821387 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.821404 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.924333 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.924633 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.924733 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.924834 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:06 crc kubenswrapper[4574]: I1004 04:47:06.924936 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:06Z","lastTransitionTime":"2025-10-04T04:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.003283 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/2.log" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.007201 4574 scope.go:117] "RemoveContainer" containerID="feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954" Oct 04 04:47:07 crc kubenswrapper[4574]: E1004 04:47:07.007372 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.021364 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.026791 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.026825 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.026835 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.026851 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.026860 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.039580 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.057930 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.072276 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.083917 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.095630 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.105683 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.118111 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.128786 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.128825 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.128835 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.128850 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.128859 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.129612 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.141886 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.153984 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded 
SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.165006 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.175722 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.186869 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.197515 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc 
kubenswrapper[4574]: I1004 04:47:07.221188 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.230557 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.230593 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.230601 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.230633 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.230642 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.232642 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:07Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.332667 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.332703 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.332715 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.332732 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.332744 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.434761 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.434798 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.434807 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.434822 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.434832 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.437182 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:07 crc kubenswrapper[4574]: E1004 04:47:07.437349 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:47:07 crc kubenswrapper[4574]: E1004 04:47:07.437408 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:23.437392042 +0000 UTC m=+69.291535084 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.536918 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.536967 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.536975 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.536989 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.536997 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.639484 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.639569 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.639582 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.639598 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.639609 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.732595 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.732627 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.732640 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.732596 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:07 crc kubenswrapper[4574]: E1004 04:47:07.732716 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:07 crc kubenswrapper[4574]: E1004 04:47:07.732808 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:07 crc kubenswrapper[4574]: E1004 04:47:07.732925 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:07 crc kubenswrapper[4574]: E1004 04:47:07.733028 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.740825 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.740853 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.740861 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.740875 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.740893 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.842617 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.842663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.842675 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.842691 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.842701 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.945368 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.945425 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.945439 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.945457 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:07 crc kubenswrapper[4574]: I1004 04:47:07.945474 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:07Z","lastTransitionTime":"2025-10-04T04:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.047909 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.047946 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.047954 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.047969 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.047979 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.150082 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.150129 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.150140 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.150157 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.150169 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.252327 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.252370 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.252381 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.252397 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.252410 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.354708 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.354744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.354753 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.354768 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.354777 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.457352 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.457402 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.457411 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.457426 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.457440 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.560473 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.560793 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.560844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.560869 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.560881 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.663671 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.663709 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.663721 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.663737 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.663748 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.765987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.766032 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.766041 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.766056 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.766066 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.868266 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.868318 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.868332 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.868350 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.868361 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.970174 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.970208 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.970219 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.970252 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:08 crc kubenswrapper[4574]: I1004 04:47:08.970264 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:08Z","lastTransitionTime":"2025-10-04T04:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.072297 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.072325 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.072335 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.072349 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.072358 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.174791 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.174837 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.174846 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.174862 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.174873 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.276598 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.276640 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.276653 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.276668 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.276679 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.378987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.379030 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.379038 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.379052 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.379063 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.480827 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.480868 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.480880 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.480897 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.480907 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.582758 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.582814 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.582825 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.582839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.582849 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.684306 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.684345 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.684354 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.684370 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.684381 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.733124 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.733201 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.733246 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:09 crc kubenswrapper[4574]: E1004 04:47:09.733329 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.733363 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:09 crc kubenswrapper[4574]: E1004 04:47:09.733452 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:09 crc kubenswrapper[4574]: E1004 04:47:09.733477 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:09 crc kubenswrapper[4574]: E1004 04:47:09.733522 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.786262 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.786298 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.786307 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.786321 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.786330 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.888145 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.888179 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.888200 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.888216 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.888226 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.990581 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.990620 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.990630 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.990645 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:09 crc kubenswrapper[4574]: I1004 04:47:09.990655 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:09Z","lastTransitionTime":"2025-10-04T04:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.092511 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.092557 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.092569 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.092586 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.092649 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.194916 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.194972 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.194987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.195012 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.195028 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.297434 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.297476 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.297486 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.297502 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.297516 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.313777 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.313815 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.313824 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.313839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.313848 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: E1004 04:47:10.327326 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:10Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.330628 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.330667 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.330678 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.330696 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.330708 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: E1004 04:47:10.341625 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:10Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.344723 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.344755 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.344763 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.344776 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.344785 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: E1004 04:47:10.356729 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:10Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.359988 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.360024 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.360035 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.360052 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.360063 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: E1004 04:47:10.370695 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:10Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.373834 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.373887 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.373899 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.373917 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.373927 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: E1004 04:47:10.384316 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:10Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:10 crc kubenswrapper[4574]: E1004 04:47:10.384433 4574 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.399757 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.399950 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.399968 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.400006 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.400018 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.503433 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.503467 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.503477 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.503492 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.503504 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.605852 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.605896 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.605904 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.605919 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.605928 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.707743 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.707780 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.707788 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.707801 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.707810 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.810313 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.810360 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.810387 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.810402 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.810410 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.912278 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.912357 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.912402 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.912446 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:10 crc kubenswrapper[4574]: I1004 04:47:10.912458 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:10Z","lastTransitionTime":"2025-10-04T04:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.014876 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.014912 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.014924 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.014939 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.014953 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.117626 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.117670 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.117679 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.117692 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.117701 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.220404 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.220437 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.220448 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.220485 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.220510 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.323014 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.323048 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.323060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.323077 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.323086 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.425029 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.425099 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.425110 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.425126 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.425136 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.528627 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.528667 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.528676 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.528690 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.528699 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.555200 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.565613 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.572177 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.582600 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.595821 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80126
8ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.614882 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.625989 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.635765 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.635815 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.635827 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 
04:47:11.635950 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.635999 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.642595 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.654309 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.664496 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.676336 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.687414 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.699030 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.710919 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.728440 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.732267 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.732288 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.732350 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:11 crc kubenswrapper[4574]: E1004 04:47:11.732377 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.732412 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:11 crc kubenswrapper[4574]: E1004 04:47:11.732476 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:11 crc kubenswrapper[4574]: E1004 04:47:11.732545 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:11 crc kubenswrapper[4574]: E1004 04:47:11.732600 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.737824 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.737854 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.737863 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.737877 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.737887 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.740442 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.749605 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.758790 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.768316 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:11Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:11 crc 
kubenswrapper[4574]: I1004 04:47:11.840876 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.840921 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.840930 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.840945 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.840954 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.942937 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.942978 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.942989 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.943005 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:11 crc kubenswrapper[4574]: I1004 04:47:11.943016 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:11Z","lastTransitionTime":"2025-10-04T04:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.045636 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.045672 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.045680 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.045695 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.045705 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.147952 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.147992 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.148002 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.148018 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.148031 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.250562 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.250596 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.250606 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.250622 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.250633 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.352995 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.353048 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.353060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.353078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.353097 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.455096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.455129 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.455139 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.455154 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.455166 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.557528 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.557580 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.557592 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.557611 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.557623 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.659960 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.659997 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.660007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.660023 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.660033 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.762000 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.762040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.762050 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.762066 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.762077 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.864677 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.864707 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.864715 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.864728 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.864736 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.966868 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.966903 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.966912 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.966927 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:12 crc kubenswrapper[4574]: I1004 04:47:12.966937 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:12Z","lastTransitionTime":"2025-10-04T04:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.069419 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.069454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.069464 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.069481 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.069492 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.172116 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.172141 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.172148 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.172162 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.172172 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.274734 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.274968 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.275090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.275179 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.275294 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.377937 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.377969 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.377979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.377998 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.378008 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.480318 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.480357 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.480366 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.480379 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.480389 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.582821 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.582852 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.582861 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.582874 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.582884 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.688483 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.688923 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.689007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.689088 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.689202 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.732578 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:13 crc kubenswrapper[4574]: E1004 04:47:13.732840 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.732644 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:13 crc kubenswrapper[4574]: E1004 04:47:13.733186 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.732594 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:13 crc kubenswrapper[4574]: E1004 04:47:13.733636 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.732674 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:13 crc kubenswrapper[4574]: E1004 04:47:13.734048 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.792504 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.792545 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.792560 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.792578 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.792589 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.894601 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.894629 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.894637 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.894668 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.894678 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.997007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.997089 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.997112 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.997155 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:13 crc kubenswrapper[4574]: I1004 04:47:13.997187 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:13Z","lastTransitionTime":"2025-10-04T04:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.099975 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.100223 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.100306 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.100381 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.100464 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.202553 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.202593 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.202603 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.202619 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.202629 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.304804 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.304833 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.304843 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.304859 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.304869 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.406856 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.406918 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.406927 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.406939 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.406947 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.508662 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.508868 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.508957 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.509048 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.509178 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.613276 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.613989 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.614120 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.614273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.614389 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.716158 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.716204 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.716213 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.716229 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.716256 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.749729 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.775504 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.787969 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2926a536-5d90-47f6-834d-0f5cc18bdf75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://718dd5c90711b2958bc92eaabc759a550e347e5b7ea17b27a562a5b6a9b1f7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5de8b7c2160d35c3a66208e600b51e26b0f637d8367954ef6cf0f0ab502ba40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05d7fc97e7eaec546ada9f50953b3522e334170238d4a8952b5aeed65e5b8b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.804978 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.816022 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.819359 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.819415 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.819426 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.819443 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.819453 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.829004 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.840163 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.854058 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.866549 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.879562 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.894183 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:
33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.907437 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.922160 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.922200 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.922213 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.922255 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.922269 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:14Z","lastTransitionTime":"2025-10-04T04:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.922282 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.934924 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.947204 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc 
kubenswrapper[4574]: I1004 04:47:14.968592 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.982135 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:14 crc kubenswrapper[4574]: I1004 04:47:14.993166 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:14Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.025045 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.025735 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.026040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.026126 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.026198 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.128810 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.129109 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.129175 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.129260 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.129365 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.232543 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.233080 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.233355 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.233614 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.233828 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.337081 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.337454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.337550 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.337641 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.337705 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.440391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.440438 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.440451 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.440468 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.440482 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.542322 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.542356 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.542364 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.542400 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.542412 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.644982 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.645695 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.645763 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.645833 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.645898 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.732927 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.732927 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:15 crc kubenswrapper[4574]: E1004 04:47:15.733352 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.733057 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:15 crc kubenswrapper[4574]: E1004 04:47:15.733625 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.733004 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:15 crc kubenswrapper[4574]: E1004 04:47:15.733809 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:15 crc kubenswrapper[4574]: E1004 04:47:15.733455 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.747995 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.748034 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.748047 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.748066 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.748078 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.850028 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.850078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.850090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.850110 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.850121 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.952011 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.952047 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.952058 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.952075 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:15 crc kubenswrapper[4574]: I1004 04:47:15.952085 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:15Z","lastTransitionTime":"2025-10-04T04:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.054596 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.054640 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.054654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.054672 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.054683 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.156573 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.156627 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.156638 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.156654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.156668 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.259397 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.259424 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.259432 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.259445 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.259456 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.361833 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.361887 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.361904 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.361920 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.361930 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.464263 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.464291 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.464307 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.464326 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.464337 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.565942 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.565976 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.565985 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.566000 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.566009 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.668269 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.668297 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.668305 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.668319 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.668327 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.770228 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.770279 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.770292 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.770315 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.770329 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.872878 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.872924 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.872933 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.872948 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.872957 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.975685 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.975718 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.975730 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.975745 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:16 crc kubenswrapper[4574]: I1004 04:47:16.975755 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:16Z","lastTransitionTime":"2025-10-04T04:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.077988 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.078025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.078033 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.078048 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.078058 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.180250 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.180291 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.180301 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.180316 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.180328 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.282513 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.282539 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.282547 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.282560 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.282569 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.385137 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.385194 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.385213 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.385272 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.385285 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.487855 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.487881 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.487889 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.487904 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.487912 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.589997 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.590024 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.590034 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.590049 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.590058 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.692273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.692318 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.692333 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.692351 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.692363 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.732417 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.732457 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.732494 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:17 crc kubenswrapper[4574]: E1004 04:47:17.732548 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.732566 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:17 crc kubenswrapper[4574]: E1004 04:47:17.732609 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:17 crc kubenswrapper[4574]: E1004 04:47:17.732697 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:17 crc kubenswrapper[4574]: E1004 04:47:17.732768 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.795037 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.795075 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.795085 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.795101 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.795111 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.897376 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.897439 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.897618 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.897637 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.897648 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.999845 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.999878 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.999887 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.999901 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:17 crc kubenswrapper[4574]: I1004 04:47:17.999909 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:17Z","lastTransitionTime":"2025-10-04T04:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.103166 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.103210 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.103223 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.103266 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.103281 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.206302 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.206350 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.206363 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.206381 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.206396 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.308300 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.308328 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.308337 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.308350 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.308358 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.410944 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.410979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.410990 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.411006 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.411017 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.513207 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.513295 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.513304 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.513317 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.513541 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.615192 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.615220 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.615228 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.615263 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.615271 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.717220 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.717279 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.717288 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.717301 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.717312 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.819261 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.819287 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.819296 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.819309 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.819317 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.921587 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.921617 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.921625 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.921638 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:18 crc kubenswrapper[4574]: I1004 04:47:18.921647 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:18Z","lastTransitionTime":"2025-10-04T04:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.023353 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.023381 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.023391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.023406 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.023416 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.126106 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.126143 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.126154 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.126171 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.126181 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.228577 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.228615 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.228623 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.228636 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.228645 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.331291 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.331331 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.331342 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.331361 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.331373 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.433108 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.433158 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.433168 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.433181 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.433189 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.535621 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.535656 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.535664 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.535678 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.535689 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.638386 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.638422 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.638431 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.638446 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.638456 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.732389 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.732431 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.732465 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.732488 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:19 crc kubenswrapper[4574]: E1004 04:47:19.732526 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:19 crc kubenswrapper[4574]: E1004 04:47:19.732638 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:19 crc kubenswrapper[4574]: E1004 04:47:19.732706 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:19 crc kubenswrapper[4574]: E1004 04:47:19.732793 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.740558 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.740597 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.740607 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.740623 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.740634 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.842513 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.842560 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.842572 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.842590 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.842602 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.944727 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.944767 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.944780 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.944795 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:19 crc kubenswrapper[4574]: I1004 04:47:19.944805 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:19Z","lastTransitionTime":"2025-10-04T04:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.046485 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.046530 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.046539 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.046555 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.046564 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.148820 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.148854 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.148862 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.148876 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.148886 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.251194 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.251255 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.251272 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.251287 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.251296 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.354025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.354062 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.354078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.354102 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.354114 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.456399 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.456457 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.456468 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.456487 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.456498 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.513936 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.513979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.513987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.514005 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.514016 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: E1004 04:47:20.525869 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:20Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.530204 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.530267 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.530282 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.530309 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.530342 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: E1004 04:47:20.545172 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:20Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.548117 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.548159 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.548169 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.548186 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.548198 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: E1004 04:47:20.558688 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:20Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.562071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.562174 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.562187 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.562206 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.562222 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: E1004 04:47:20.575132 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:20Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.578559 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.578585 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.578594 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.578608 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.578617 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: E1004 04:47:20.589812 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:20Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:20 crc kubenswrapper[4574]: E1004 04:47:20.589967 4574 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.591223 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.591263 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.591273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.591286 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.591305 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.693079 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.693112 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.693121 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.693139 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.693149 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.733390 4574 scope.go:117] "RemoveContainer" containerID="feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954" Oct 04 04:47:20 crc kubenswrapper[4574]: E1004 04:47:20.733619 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.794654 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.794711 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.794722 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.794742 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.794753 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.897498 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.897546 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.897558 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.897576 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.897587 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.999372 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.999401 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.999410 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.999426 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:20 crc kubenswrapper[4574]: I1004 04:47:20.999437 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:20Z","lastTransitionTime":"2025-10-04T04:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.101168 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.101207 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.101219 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.101265 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.101285 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.203340 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.203367 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.203376 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.203390 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.203399 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.306090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.306149 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.306161 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.306176 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.306186 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.408041 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.408110 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.408122 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.408141 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.408153 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.511069 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.511113 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.511124 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.511140 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.511155 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.613932 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.614006 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.614021 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.614037 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.614051 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.716194 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.716251 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.716267 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.716287 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.716299 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.732744 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.732812 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.732823 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.732908 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:21 crc kubenswrapper[4574]: E1004 04:47:21.733017 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:21 crc kubenswrapper[4574]: E1004 04:47:21.733104 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:21 crc kubenswrapper[4574]: E1004 04:47:21.733177 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:21 crc kubenswrapper[4574]: E1004 04:47:21.733291 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.818330 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.818372 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.818383 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.818399 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.818410 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.921391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.921423 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.921432 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.921448 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:21 crc kubenswrapper[4574]: I1004 04:47:21.921457 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:21Z","lastTransitionTime":"2025-10-04T04:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.023889 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.024007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.024022 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.024038 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.024067 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.126019 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.126048 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.126058 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.126073 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.126081 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.228289 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.228324 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.228336 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.228353 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.228364 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.330456 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.330495 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.330506 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.330524 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.330537 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.432659 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.432712 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.432725 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.432744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.432760 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.534812 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.534844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.534855 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.534871 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.534881 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.636813 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.636851 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.636859 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.636872 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.636881 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.738461 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.738497 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.738507 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.738521 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.738531 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.840807 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.840836 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.840844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.840858 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.840867 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.942735 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.942841 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.942856 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.942872 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:22 crc kubenswrapper[4574]: I1004 04:47:22.942885 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:22Z","lastTransitionTime":"2025-10-04T04:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.044612 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.044643 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.044652 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.044666 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.044675 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.147100 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.147140 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.147148 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.147163 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.147172 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.249163 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.249206 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.249215 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.249250 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.249260 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.351780 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.351836 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.351845 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.351859 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.351868 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.455380 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.455431 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.455441 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.455457 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.455470 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.494977 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:23 crc kubenswrapper[4574]: E1004 04:47:23.495182 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:47:23 crc kubenswrapper[4574]: E1004 04:47:23.495561 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:47:55.495536699 +0000 UTC m=+101.349679741 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.557784 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.557820 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.557829 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.558035 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.558044 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
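The `durationBeforeRetry 32s` in the mount failure above is consistent with the kubelet's doubling retry schedule for failed volume operations. A minimal sketch (not the actual kubelet source; the initial delay and cap here are assumptions matching commonly observed values, not taken from this log):

```python
# Illustrative sketch of an exponential-backoff retry schedule like the one the
# kubelet applies to failed MountVolume operations. INITIAL and CAP are assumed
# values; only the 32s step is directly observable in the log above.
INITIAL = 0.5   # assumed first retry delay, in seconds
FACTOR = 2.0    # delay doubles after each failure
CAP = 122.0     # assumed maximum delay (~2m2s)

def backoff_schedule(attempts):
    """Return the delay (seconds) before each retry, doubling up to the cap."""
    delays, d = [], INITIAL
    for _ in range(attempts):
        delays.append(min(d, CAP))
        d *= FACTOR
    return delays

# The 7th delay in this schedule is 32s, matching durationBeforeRetry above.
print(backoff_schedule(8))
```

Under this schedule the secret mount has already failed several times, so the kubelet will not reattempt it until the 32-second window at 04:47:55 elapses.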
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.661054 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.661096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.661105 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.661120 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.661130 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.732449 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.732461 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:23 crc kubenswrapper[4574]: E1004 04:47:23.732579 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.732469 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:23 crc kubenswrapper[4574]: E1004 04:47:23.732634 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.732477 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:23 crc kubenswrapper[4574]: E1004 04:47:23.732668 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:23 crc kubenswrapper[4574]: E1004 04:47:23.732695 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.763268 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.763311 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.763324 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.763339 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.763352 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.865800 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.865834 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.865844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.865860 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.865873 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.969994 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.970315 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.970404 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.970514 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:23 crc kubenswrapper[4574]: I1004 04:47:23.970600 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:23Z","lastTransitionTime":"2025-10-04T04:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.073275 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.073300 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.073308 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.073322 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.073331 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.176032 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.176594 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.176710 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.176833 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.176945 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.279692 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.279950 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.280061 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.280144 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.280226 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.382702 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.382762 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.382773 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.382797 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.382842 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.485737 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.486011 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.486088 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.486195 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.486325 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.590280 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.590325 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.590343 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.590361 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.590371 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.692551 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.692608 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.692619 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.692635 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.692646 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
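Each of the repeated "Node became not ready" entries above embeds the same node condition as inline JSON after `condition=`. A small sketch of pulling that structure out of a log line (the sample line below is abridged from the entries above; field names match the log text exactly):

```python
import json

# Parse the Ready condition embedded in a "Node became not ready" log entry.
# The sample is abridged from the log above; real entries carry a longer message.
line = ('condition={"type":"Ready","status":"False",'
        '"lastHeartbeatTime":"2025-10-04T04:47:24Z",'
        '"lastTransitionTime":"2025-10-04T04:47:24Z",'
        '"reason":"KubeletNotReady",'
        '"message":"container runtime network not ready: NetworkReady=false"}')

# Everything after "condition=" is valid JSON.
cond = json.loads(line.split("condition=", 1)[1])
print(cond["type"], cond["status"], cond["reason"])
```

The `reason` stays `KubeletNotReady` across every repetition, which is how one can tell the ~100ms cadence of these entries is one unresolved condition being re-reported, not a flapping node.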
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.742582 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.753085 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.763997 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.774277 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.786436 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
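Every `status_manager` failure above shares one root cause: the `pod.network-node-identity.openshift.io` webhook's serving certificate expired before this boot. The TLS error helpfully reports both the current time and the certificate's `notAfter`; comparing them (timestamps taken verbatim from the log) shows how stale the certificate is:

```python
from datetime import datetime

# Both timestamps are quoted directly from the x509 error in the log above:
# "current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z".
now = datetime.fromisoformat("2025-10-04T04:47:24+00:00")
not_after = datetime.fromisoformat("2025-08-24T17:21:41+00:00")

expired_for = now - not_after
print(f"webhook certificate expired {expired_for.days} days before this boot")
```

This explains why the pod status patches keep failing even though the pods themselves are running: the API server rejects every webhook call, so no status update can land until the certificate is rotated.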
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.796885 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.796930 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.796951 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.796971 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.796985 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.803365 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.863335 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.879682 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.891831 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc 
kubenswrapper[4574]: I1004 04:47:24.911754 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.911804 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.911816 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.911837 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.911850 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:24Z","lastTransitionTime":"2025-10-04T04:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.923553 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.937804 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.948515 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.960268 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.977719 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:24 crc kubenswrapper[4574]: I1004 04:47:24.988956 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2926a536-5d90-47f6-834d-0f5cc18bdf75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://718dd5c90711b2958bc92eaabc759a550e347e5b7ea17b27a562a5b6a9b1f7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5de8b7c2160d35c3a66208e600b51e26b0f637d8367954ef6cf0f0ab502ba40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05d7fc97e7eaec546ada9f50953b3522e334170238d4a8952b5aeed65e5b8b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:24Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.003939 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:25Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.013026 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:25Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.013927 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.014032 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.014117 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.014201 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.014318 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.027186 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:25Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.116704 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.116736 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.116744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.116759 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.116768 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.218967 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.219003 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.219013 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.219030 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.219039 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.321973 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.322026 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.322037 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.322060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.322073 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.424106 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.424141 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.424152 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.424172 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.424185 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.526099 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.526133 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.526143 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.526157 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.526167 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.628986 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.629029 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.629040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.629056 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.629069 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.731269 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.731312 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.731325 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.731341 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.731351 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.732473 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.732492 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.732504 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.732538 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:25 crc kubenswrapper[4574]: E1004 04:47:25.732580 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:25 crc kubenswrapper[4574]: E1004 04:47:25.732688 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:25 crc kubenswrapper[4574]: E1004 04:47:25.732762 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:25 crc kubenswrapper[4574]: E1004 04:47:25.732824 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.833155 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.833197 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.833208 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.833223 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.833248 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.935417 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.935460 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.935473 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.935491 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:25 crc kubenswrapper[4574]: I1004 04:47:25.935502 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:25Z","lastTransitionTime":"2025-10-04T04:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.037467 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.037505 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.037513 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.037531 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.037541 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.140734 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.140782 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.140791 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.140808 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.140817 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.243798 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.243844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.243858 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.243876 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.243887 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.345703 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.345746 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.345760 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.345776 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.345785 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.448285 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.448317 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.448327 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.448346 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.448360 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.550692 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.550742 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.550759 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.550783 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.550800 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.656028 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.656069 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.656077 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.656090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.656098 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.757771 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.757807 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.757820 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.757836 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.757845 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.859965 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.860004 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.860015 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.860032 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.860042 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.962519 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.962550 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.962559 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.962573 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:26 crc kubenswrapper[4574]: I1004 04:47:26.962581 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:26Z","lastTransitionTime":"2025-10-04T04:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.061724 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/0.log" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.061991 4574 generic.go:334] "Generic (PLEG): container finished" podID="649982aa-c9c5-41ce-a056-48ad058e9aa5" containerID="c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b" exitCode=1 Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.062024 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerDied","Data":"c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.062473 4574 scope.go:117] "RemoveContainer" containerID="c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.066451 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.066501 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.066509 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.066525 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.066539 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.077143 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.097858 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.110109 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2926a536-5d90-47f6-834d-0f5cc18bdf75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://718dd5c90711b2958bc92eaabc759a550e347e5b7ea17b27a562a5b6a9b1f7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5de8b7c2160d35c3a66208e600b51e26b0f637d8367954ef6cf0f0ab502ba40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05d7fc97e7eaec546ada9f50953b3522e334170238d4a8952b5aeed65e5b8b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.122422 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.134450 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.146429 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.157137 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.167918 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.169051 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.169077 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.169085 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.169100 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.169108 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.179063 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.189890 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:26Z\\\",\\\"message\\\":\\\"2025-10-04T04:46:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb\\\\n2025-10-04T04:46:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb to /host/opt/cni/bin/\\\\n2025-10-04T04:46:41Z [verbose] multus-daemon started\\\\n2025-10-04T04:46:41Z [verbose] Readiness Indicator file check\\\\n2025-10-04T04:47:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.204617 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" 
index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.216793 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.226748 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.236841 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.245778 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc 
kubenswrapper[4574]: I1004 04:47:27.266021 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.274197 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.274273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.274285 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.274303 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.274317 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.280961 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.290911 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:27Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.377025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.377062 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.377073 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.377089 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.377099 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.478819 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.478852 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.478861 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.478874 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.478884 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.581358 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.581390 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.581400 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.581415 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.581425 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.683118 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.683154 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.683164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.683180 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.683190 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.732481 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.732533 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.732600 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:27 crc kubenswrapper[4574]: E1004 04:47:27.732600 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.732489 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:27 crc kubenswrapper[4574]: E1004 04:47:27.732694 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:27 crc kubenswrapper[4574]: E1004 04:47:27.732775 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:27 crc kubenswrapper[4574]: E1004 04:47:27.732837 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.786008 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.786053 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.786062 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.786078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.786088 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.888060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.888102 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.888115 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.888133 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.888146 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.990893 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.990927 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.990936 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.990949 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:27 crc kubenswrapper[4574]: I1004 04:47:27.990958 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:27Z","lastTransitionTime":"2025-10-04T04:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.068133 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/0.log" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.068195 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerStarted","Data":"231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.089420 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.093255 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.093296 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.093308 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.093327 4574 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.093341 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.103700 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.114408 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.125350 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3
d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.135390 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc 
kubenswrapper[4574]: I1004 04:47:28.147007 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2926a536-5d90-47f6-834d-0f5cc18bdf75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://718dd5c90711b2958bc92eaabc759a550e347e5b7ea17b27a562a5b6a9b1f7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5de8b7c2160d35c3a66208e600b51e26b0f637d8367954ef6cf0f0ab502ba40\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05d7fc97e7eaec546ada9f50953b3522e334170238d4a8952b5aeed65e5b8b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.161429 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.172200 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.185715 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29
acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.195056 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.195095 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.195106 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.195122 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 
04:47:28.195133 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.204824 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d62
0ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.217333 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.230444 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.242899 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.253858 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.267857 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.280182 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.290877 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.296689 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.296717 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.296727 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.296743 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.296753 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.302768 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:26Z\\\",\\\"message\\\":\\\"2025-10-04T04:46:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb\\\\n2025-10-04T04:46:40+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb to /host/opt/cni/bin/\\\\n2025-10-04T04:46:41Z [verbose] multus-daemon started\\\\n2025-10-04T04:46:41Z [verbose] Readiness Indicator file check\\\\n2025-10-04T04:47:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:47:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:28Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.400521 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.400559 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.400569 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.400585 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.400596 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.503001 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.503054 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.503065 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.503083 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.503095 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.605520 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.605561 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.605572 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.605588 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.605598 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.707832 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.707869 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.707880 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.707897 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.707907 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.810173 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.810208 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.810219 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.810257 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.810274 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.913196 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.913255 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.913271 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.913321 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:28 crc kubenswrapper[4574]: I1004 04:47:28.913333 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:28Z","lastTransitionTime":"2025-10-04T04:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.016070 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.016102 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.016112 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.016128 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.016138 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.118811 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.118844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.118854 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.118870 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.118879 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.220986 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.221026 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.221036 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.221049 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.221059 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.322936 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.322970 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.322988 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.323009 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.323020 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.424935 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.424987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.424997 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.425014 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.425025 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.527950 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.527985 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.527997 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.528015 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.528026 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.630326 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.630365 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.630375 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.630391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.630403 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.732329 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.732509 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:29 crc kubenswrapper[4574]: E1004 04:47:29.732623 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.732645 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.732670 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:29 crc kubenswrapper[4574]: E1004 04:47:29.732811 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:29 crc kubenswrapper[4574]: E1004 04:47:29.732862 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:29 crc kubenswrapper[4574]: E1004 04:47:29.732903 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.733473 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.733507 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.733516 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.733530 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.733541 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.835597 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.835639 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.835648 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.835665 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.835677 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.937826 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.937877 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.937905 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.937923 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:29 crc kubenswrapper[4574]: I1004 04:47:29.937934 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:29Z","lastTransitionTime":"2025-10-04T04:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.041075 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.041140 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.041157 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.041179 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.041194 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.143776 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.143856 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.143872 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.143900 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.143918 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.249118 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.249156 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.249167 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.249184 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.249195 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.351324 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.351364 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.351376 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.351391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.351405 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.454724 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.454807 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.454821 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.454836 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.454870 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.558542 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.558586 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.558598 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.558614 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.558624 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.661542 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.661592 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.661604 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.661625 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.661636 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.685454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.685495 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.685508 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.685523 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.685532 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: E1004 04:47:30.699447 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:30Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.705275 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.705316 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.705328 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.705347 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.705357 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: E1004 04:47:30.717825 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:30Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.724508 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.724570 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.724588 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.724613 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.724638 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: E1004 04:47:30.738948 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:30Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.744001 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.744046 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.744058 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.744078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.744091 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: E1004 04:47:30.755921 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:30Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.759091 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.759135 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.759146 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.759159 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.759169 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: E1004 04:47:30.770252 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:30Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:30 crc kubenswrapper[4574]: E1004 04:47:30.770427 4574 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.772003 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.772030 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.772039 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.772055 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.772065 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.876116 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.876193 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.876212 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.876265 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.876286 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.983969 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.984319 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.984453 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.984543 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:30 crc kubenswrapper[4574]: I1004 04:47:30.984625 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:30Z","lastTransitionTime":"2025-10-04T04:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.087179 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.087254 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.087271 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.087286 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.087297 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.190090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.190164 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.190184 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.190212 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.190256 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.294505 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.294540 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.294549 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.294564 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.294575 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.397006 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.397060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.397072 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.397092 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.397106 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.499851 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.499893 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.499903 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.499920 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.499933 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.602546 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.602588 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.602601 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.602617 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.602627 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.704916 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.705156 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.705243 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.705321 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.705389 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.732549 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.732582 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.732585 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.732647 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:31 crc kubenswrapper[4574]: E1004 04:47:31.733137 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:31 crc kubenswrapper[4574]: E1004 04:47:31.733024 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:31 crc kubenswrapper[4574]: E1004 04:47:31.732901 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:31 crc kubenswrapper[4574]: E1004 04:47:31.733165 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.808037 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.808326 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.808412 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.808484 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.808547 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.910523 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.910830 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.910914 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.910980 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:31 crc kubenswrapper[4574]: I1004 04:47:31.911040 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:31Z","lastTransitionTime":"2025-10-04T04:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.013335 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.013381 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.013393 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.013411 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.013424 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.115527 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.115563 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.115571 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.115588 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.115598 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.218198 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.218265 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.218276 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.218296 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.218305 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.320459 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.320488 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.320506 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.320524 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.320537 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.423581 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.423632 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.423644 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.423663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.423677 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.526040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.526075 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.526084 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.526099 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.526110 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.628898 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.628945 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.628959 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.628979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.628991 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.731393 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.731451 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.731465 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.731485 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.731504 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.735578 4574 scope.go:117] "RemoveContainer" containerID="feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.833760 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.834079 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.834199 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.834460 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.834537 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.936873 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.936919 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.936929 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.936944 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:32 crc kubenswrapper[4574]: I1004 04:47:32.936953 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:32Z","lastTransitionTime":"2025-10-04T04:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.042464 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.042531 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.042544 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.042560 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.042572 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.145572 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.145615 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.145624 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.145643 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.145654 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.248286 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.248312 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.248321 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.248335 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.248344 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.351120 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.351166 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.351175 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.351191 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.351205 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.453187 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.453220 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.453228 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.453264 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.453275 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.555376 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.555406 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.555415 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.555430 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.555438 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.658008 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.658043 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.658053 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.658068 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.658090 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.732359 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.732399 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.732452 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.732473 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:33 crc kubenswrapper[4574]: E1004 04:47:33.732473 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:33 crc kubenswrapper[4574]: E1004 04:47:33.732560 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:33 crc kubenswrapper[4574]: E1004 04:47:33.732597 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:33 crc kubenswrapper[4574]: E1004 04:47:33.732642 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.759913 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.759947 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.759956 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.759970 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.760001 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.862979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.863028 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.863040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.863057 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.863069 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.964853 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.964883 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.964894 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.964909 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:33 crc kubenswrapper[4574]: I1004 04:47:33.964921 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:33Z","lastTransitionTime":"2025-10-04T04:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.066826 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.066857 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.066868 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.066881 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.066890 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.088829 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/3.log" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.089703 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/2.log" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.092666 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5" exitCode=1 Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.092710 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.092757 4574 scope.go:117] "RemoveContainer" containerID="feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.093458 4574 scope.go:117] "RemoveContainer" containerID="ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5" Oct 04 04:47:34 crc kubenswrapper[4574]: E1004 04:47:34.093653 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.113110 4574 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe
91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e17
8100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.125073 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 
04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.135779 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.145315 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f
174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\
"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.154569 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc 
kubenswrapper[4574]: I1004 04:47:34.163383 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2926a536-5d90-47f6-834d-0f5cc18bdf75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://718dd5c90711b2958bc92eaabc759a550e347e5b7ea17b27a562a5b6a9b1f7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5de8b7c2160d35c3a66208e600b51e26b0f637d8367954ef6cf0f0ab502ba40\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05d7fc97e7eaec546ada9f50953b3522e334170238d4a8952b5aeed65e5b8b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.173987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.174017 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.174025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.174038 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.174046 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.174694 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.183845 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39ae
d0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.197653 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80126
8ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.216315 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:33Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1004 04:47:33.943332 6478 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1004 04:47:33.942881 6478 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9
cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.227838 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.241012 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.252941 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.265015 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.276328 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.276369 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.276380 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc 
kubenswrapper[4574]: I1004 04:47:34.276400 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.276413 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.277833 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04
:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.291891 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.302804 4574 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.315714 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:26Z\\\",\\\"message\\\":\\\"2025-10-04T04:46:40+00:00 
[cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb\\\\n2025-10-04T04:46:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb to /host/opt/cni/bin/\\\\n2025-10-04T04:46:41Z [verbose] multus-daemon started\\\\n2025-10-04T04:46:41Z [verbose] Readiness Indicator file check\\\\n2025-10-04T04:47:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:47:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"nam
e\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.377975 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.378014 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.378024 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.378040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.378052 4574 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.480696 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.480744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.480752 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.480767 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.480778 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.583055 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.583089 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.583100 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.583116 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.583127 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.684525 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.684760 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.684770 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.684784 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.684794 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.745054 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c8774514771549ae7ad9b6d81cd550bb78064f54fcc07e7fa3e6e53e86bf41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.756457 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6wsfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"649982aa-c9c5-41ce-a056-48ad058e9aa5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c030f20450cb89
0f1de0863ae0497e515723144055a797306f503b40d1701e9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:26Z\\\",\\\"message\\\":\\\"2025-10-04T04:46:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb\\\\n2025-10-04T04:46:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc77fdc5-6fff-4793-af00-7470011927cb to /host/opt/cni/bin/\\\\n2025-10-04T04:46:41Z [verbose] multus-daemon started\\\\n2025-10-04T04:46:41Z [verbose] Readiness Indicator file check\\\\n2025-10-04T04:47:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:47:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\"
,\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qs2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6wsfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.768638 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba50beb-c1bf-4acd-ac00-280b9f862d67\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0521af43f6821b3da01b811f020d874e2c65ddb62013aa84d1d7c0ffecc3b53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba2df185a339264ccbd8ac491569859ad2045c1b6bd3b7c60e2c5a3328ba063\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc3f9d46e64357bae686e9848fc265b2ebbf1ca62009b857e93692c03350ff5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://986c3babbaa14d92000415c9de6698743f566049d3f5dd69e0c53589a3ce70d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38f9278c02108af9a0c5eb42c7e73e9f1f115baa97b5688d16fa2b02fe0432f6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-04T04:46:33Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1004 04:46:33.122816 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1004 04:46:33.122824 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1004 04:46:33.122830 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759553177\\\\\\\\\\\\\\\" (2025-10-04 04:46:17 +0000 UTC to 2025-11-03 04:46:18 +0000 UTC (now=2025-10-04 04:46:33.122808362 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122897 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1004 04:46:33.122907 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1004 04:46:33.122929 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759553178\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759553178\\\\\\\\\\\\\\\" (2025-10-04 03:46:18 +0000 UTC to 2026-10-04 03:46:18 +0000 UTC (now=2025-10-04 04:46:33.122911335 +0000 UTC))\\\\\\\"\\\\nI1004 04:46:33.122950 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1004 04:46:33.122969 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1004 
04:46:33.122985 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1131066998/tls.crt::/tmp/serving-cert-1131066998/tls.key\\\\\\\"\\\\nI1004 04:46:33.123074 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF1004 04:46:33.123447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbab167ffc5fc0df9088924b89ebd6ce7ac21da05745c7883cbde18be442685f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7dda9ff0ad6764620d2007845543ad2fc8e711d399b1bda471eadf1c2410873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.778863 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b7cd3ea-da00-4026-9ac0-c5eb967d9b71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01405323df999716b77ebe2a72897449cf76300f0799ff72b1c39fc21fa23713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0648b2331d68cbe1a4087218240e0454eb5cab7ff5480b574d7dbb9ebba4edc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f537a14e709f34f08da73b70f44af162025155117d42957980f6ef1ebf85fd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d5e0f3f1bc6ea83bc511ff4c4dd9ba3490a2cf47387852b1ebb99b6aeefded\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-04T04:46:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.786209 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.786290 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.786301 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.786314 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.786323 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.789761 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17493fd3-2995-469d-bd5a-5158f2866895\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5deb39f647602f3cedcd4604f867975e0960d63051f7c38d19e432721c43ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":
\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l2zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.803204 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"458beb2c-7930-4fed-87c1-97ef6193e7ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa0e63bbfbd2156ecb04abce3c97a03171d3026a1bb85133f174a13b877b72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad60080fa931b6415bc4aaa2fc80397fb505a
8310fb1fd17d6535aa9b6e1eb4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m82q2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gs8xn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.817389 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-stmq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"833018b5-b584-4e77-a95f-fe56f6dd5945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tsnkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-stmq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc 
kubenswrapper[4574]: I1004 04:47:34.836722 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"123b6811-7dbb-4a63-aab9-1aa786c11c3c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179f983c323b32fa72abcae9974f642dcf4f72815f7083259f274c072dfdcde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2cc7341ff5c0ddf5557578364c5006e7b48915a29872d58fb96fe91fbb8dcf77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c61267868d0c668a12d0e271372f41ad5b7888815c9d60f52fdd5f116a8a4bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b4f29fb3bf60523b5835e679aebf0fc4c2e009f0ecab90ac34999167a345d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c46003566687b655b1a831539ac81e94b1ef674d1e81996b9c96814d73aeeaae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97ab2930086fdbf5f2b85916d1e178100a6fcff22ded9b88989f9c2cf81e4da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94cce937a3868b3122989dea8a885b4e699a54de15f5efb048ca9136c6645b8a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8e18909921262e2ad3811a3d919922ceaa7a1c56abd4f7df207b6b6f0d2673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.848751 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0117ab20a0f1ff62537423abb58be9a7055616a664cc7617b12481145e2637aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be80ff78a73de11c41d728e3b74293299a46f36f2c5e1f3524a86cb815acc0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.858927 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dmzfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96222110-95c8-4caa-b42a-7526e39ae0e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1cf80a376e190d10477faf6ab40bbb170a72eef455bab8dacb2af4777068cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vtgkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dmzfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.875954 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d896311-2a08-4a70-b74e-2a9b10abc7ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1408910c1afe780d29acbfc0d814f8575cedac8bde12c1ca280d11c20e2b4df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db111d4e777ff976a8658acd98c6c86f467f16e991dcf194891dc2941ca25e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1645f66d12e3bb6fa28496b0c1c4686877989d591c57b44f82d1e9399f4a475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://612a0ac2c6551480211733824d490bfafc259d892302f7d02746b158695b8e12\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801268ca7e7b01ca41552da5bbc6bf5ff3d4e5d0954e01bc73f2e79ffd6dd3c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd36795908
0eac4870b3586692aec52c178eb209ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1717238882d23da060017dfd367959080eac4870b3586692aec52c178eb209ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b393805af581d7af9edfd100a3c5ff3e0ff8014183fe2368055d46258114e15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-04T04:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgjqq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b9dlv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.888027 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.888080 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.888092 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.888109 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.888121 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.895403 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e473790c-4fad-4637-9d72-0dd6310b4ae0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feece3e0cf04ec54e732b1e5ae00150bb4eed7ff132c6aadceb893d8edb6a954\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:05Z\\\",\\\"message\\\":\\\"ows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1004 04:47:05.598290 6130 services_controller.go:356] Processing sync for service 
openshift-kube-scheduler/scheduler for network=default\\\\nI1004 04:47:05.598295 6130 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1004 04:47:05.596332 6130 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI1004 04:47:05.598327 6130 obj_retry.go:409] Going to retry *v1.Pod resource setup for 9 objects: [openshift-multus/network-metrics-daemon-stmq5 openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn openshift-dns/node-resolver-dmzfp openshift-machine-config-operator/machine-config-daemon-wl5xt openshift-multus/multus-6wsfn openshift-etcd/etcd-crc openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-node-ntdng]\\\\nI1004 04:47:05.598349 6130 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1004 04:47:05.598364 6130 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-04T04:47:33Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1004 04:47:33.943332 6478 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1004 04:47:33.942881 6478 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-04T04:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9
cb37d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-znkbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ntdng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.906751 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2926a536-5d90-47f6-834d-0f5cc18bdf75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://718dd5c90711b2958bc92eaabc759a550e347e5b7ea17b27a562a5b6a9b1f7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5de8b7c2160d35c3a66208e600b51e26b0f637d8367954ef6cf0f0ab502ba40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05d7fc97e7eaec546ada9f50953b3522e334170238d4a8952b5aeed65e5b8b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a34d998e05a531630d901f20f0d5e0738377268087a771814620b08d5e39ff27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-04T04:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-04T04:46:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.918906 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://235d253a8175dcc681549b7ad63f8e1db8d3473efe066a405bce99d45c7ce84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.930384 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.941649 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.951785 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75910bdc-1940-4d15-b390-4bcfcec9f72c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b52578b28d4033aa056c4216d391b332be77c6acd8a4e381f10aef651c0d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e
2aba61bec551b90384e6cb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-04T04:46:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lclg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-04T04:46:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wl5xt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.961908 4574 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-04T04:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:34Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.990987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.991297 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.991469 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.991599 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:34 crc kubenswrapper[4574]: I1004 04:47:34.991709 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:34Z","lastTransitionTime":"2025-10-04T04:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.093582 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.093634 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.093646 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.093664 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.093675 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.096930 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/3.log" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.195872 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.196539 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.196632 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.196714 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.196796 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.299465 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.299505 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.299516 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.299533 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.299544 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.401747 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.401777 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.401785 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.401798 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.401807 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.503561 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.503599 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.503610 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.503623 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.503633 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.606300 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.606530 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.606607 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.606687 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.606801 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.709319 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.709350 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.709360 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.709377 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.709389 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.732923 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.733009 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:35 crc kubenswrapper[4574]: E1004 04:47:35.733294 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.733096 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:35 crc kubenswrapper[4574]: E1004 04:47:35.733607 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.733009 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:35 crc kubenswrapper[4574]: E1004 04:47:35.733792 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:35 crc kubenswrapper[4574]: E1004 04:47:35.733406 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.812123 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.812323 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.812345 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.812364 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.812374 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.914796 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.914827 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.914835 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.914848 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:35 crc kubenswrapper[4574]: I1004 04:47:35.914858 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:35Z","lastTransitionTime":"2025-10-04T04:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.017260 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.017307 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.017319 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.017374 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.017387 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.119760 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.120012 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.120132 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.120205 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.120314 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.222450 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.222475 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.222484 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.222499 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.222510 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.324582 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.324849 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.324977 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.325071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.325153 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.427788 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.427817 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.427825 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.427838 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.427847 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.530699 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.530736 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.530753 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.530784 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.530797 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.634594 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.634638 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.634655 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.634677 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.634694 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.736356 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.736395 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.736404 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.736417 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.736429 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.839840 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.839885 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.839895 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.839914 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.839926 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.942359 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.942725 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.942828 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.942922 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:36 crc kubenswrapper[4574]: I1004 04:47:36.943003 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:36Z","lastTransitionTime":"2025-10-04T04:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.045080 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.045114 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.045125 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.045141 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.045153 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.147347 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.147624 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.147792 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.147914 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.148136 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.250553 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.250585 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.250595 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.250610 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.250622 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.352770 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.352816 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.352826 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.352841 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.352851 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.438817 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.438982 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:41.43896279 +0000 UTC m=+147.293105832 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.455441 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.455476 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.455487 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.455504 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.455516 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.540168 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540300 4574 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.540476 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540527 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.540510708 +0000 UTC m=+147.394653740 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.540602 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.540654 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540777 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540802 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540769 4574 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540895 4574 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.540872999 +0000 UTC m=+147.395016031 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540814 4574 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.541078 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.541058284 +0000 UTC m=+147.395201326 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.540818 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.541268 4574 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.541356 4574 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.541638 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.541628951 +0000 UTC m=+147.395771993 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.558630 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.558663 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.558672 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.558707 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.558718 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.661262 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.661511 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.661594 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.661718 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.661831 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.732933 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.732968 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.734007 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.734005 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.733970 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.734121 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.734160 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:37 crc kubenswrapper[4574]: E1004 04:47:37.734245 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.765606 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.765938 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.766064 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.766171 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.766290 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.868149 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.868180 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.868189 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.868202 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.868211 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.970840 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.970876 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.970886 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.970901 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:37 crc kubenswrapper[4574]: I1004 04:47:37.970912 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:37Z","lastTransitionTime":"2025-10-04T04:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.073066 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.073127 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.073145 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.073160 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.073189 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.176302 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.176430 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.176439 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.176472 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.176482 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.279588 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.279647 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.279665 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.279689 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.279707 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.383286 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.383349 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.383369 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.383396 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.383417 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.486822 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.486856 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.486867 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.486885 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.486896 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.592017 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.592076 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.592094 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.592161 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.592198 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.695344 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.695691 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.695823 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.695946 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.696115 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.744572 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.798901 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.798951 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.798962 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.798977 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.798986 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.901396 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.901443 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.901456 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.901469 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:38 crc kubenswrapper[4574]: I1004 04:47:38.901477 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:38Z","lastTransitionTime":"2025-10-04T04:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.003341 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.003653 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.003664 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.003681 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.003708 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.105542 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.105614 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.105624 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.105662 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.105679 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.208097 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.208435 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.208562 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.208682 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.208773 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.311457 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.311504 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.311517 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.311534 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.311545 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.414265 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.414324 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.414338 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.414377 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.414392 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.516700 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.516948 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.517018 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.517088 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.517145 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.619096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.619139 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.619151 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.619167 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.619178 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.721746 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.721980 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.722042 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.722116 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.722181 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.732306 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.732380 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.732362 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.732399 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:39 crc kubenswrapper[4574]: E1004 04:47:39.732924 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:39 crc kubenswrapper[4574]: E1004 04:47:39.732948 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:39 crc kubenswrapper[4574]: E1004 04:47:39.732617 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:39 crc kubenswrapper[4574]: E1004 04:47:39.732754 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.824193 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.824276 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.824288 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.824304 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.824315 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.927017 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.927050 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.927060 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.927074 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:39 crc kubenswrapper[4574]: I1004 04:47:39.927084 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:39Z","lastTransitionTime":"2025-10-04T04:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.029047 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.029088 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.029096 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.029111 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.029121 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.130952 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.130999 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.131009 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.131025 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.131037 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.233533 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.233576 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.233587 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.233603 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.233615 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.335822 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.335853 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.335864 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.335880 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.335891 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.438276 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.438308 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.438317 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.438334 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.438345 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.540592 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.540627 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.540636 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.540649 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.540658 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.642948 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.643358 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.643487 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.643568 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.643657 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.746024 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.746064 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.746074 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.746089 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.746099 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.848913 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.849330 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.849435 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.849541 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.849635 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.909065 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.909101 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.909110 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.909123 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.909133 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: E1004 04:47:40.920576 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.924531 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.924574 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.924588 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.924606 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.924617 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: E1004 04:47:40.938360 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.942192 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.942284 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.942294 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.942335 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.942349 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: E1004 04:47:40.958575 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.962835 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.962888 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.962905 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.962928 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.962945 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: E1004 04:47:40.980655 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.984793 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.984841 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.984851 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.984866 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:40 crc kubenswrapper[4574]: I1004 04:47:40.984876 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:40Z","lastTransitionTime":"2025-10-04T04:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:40 crc kubenswrapper[4574]: E1004 04:47:40.998313 4574 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-04T04:47:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3b060499-a4fb-4547-9cda-a86b5d4fd2fa\\\",\\\"systemUUID\\\":\\\"9757b487-9d09-40ae-a5ee-25ae49bc71e6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-04T04:47:40Z is after 2025-08-24T17:21:41Z" Oct 04 04:47:40 crc kubenswrapper[4574]: E1004 04:47:40.998487 4574 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.000897 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.000945 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.000961 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.000982 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.001001 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.103492 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.103849 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.103964 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.104067 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.104156 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.207186 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.207248 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.207258 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.207273 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.207283 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.309497 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.309551 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.309560 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.309575 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.309586 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.412028 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.412058 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.412066 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.412079 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.412088 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.515064 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.515094 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.515103 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.515118 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.515130 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.617970 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.618028 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.618040 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.618059 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.618072 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.720683 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.720732 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.720745 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.720763 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.720776 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.733168 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.733187 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:41 crc kubenswrapper[4574]: E1004 04:47:41.733333 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.733304 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.733188 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:41 crc kubenswrapper[4574]: E1004 04:47:41.733466 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:41 crc kubenswrapper[4574]: E1004 04:47:41.733508 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:41 crc kubenswrapper[4574]: E1004 04:47:41.733554 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.825912 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.825964 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.825976 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.825994 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.826007 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.928953 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.928986 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.928994 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.929007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:41 crc kubenswrapper[4574]: I1004 04:47:41.929018 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:41Z","lastTransitionTime":"2025-10-04T04:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.030953 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.030987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.030998 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.031013 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.031023 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.133582 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.133629 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.133641 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.133660 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.133713 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.235923 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.235964 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.235979 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.235995 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.236005 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.337943 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.337975 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.337985 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.337999 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.338007 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.440472 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.440510 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.440521 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.440537 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.440549 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.543080 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.543122 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.543132 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.543149 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.543162 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.646555 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.646592 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.646609 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.646648 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.646660 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.749488 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.749530 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.749542 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.749560 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.749574 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.852820 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.852862 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.852871 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.852888 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.852898 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.955433 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.955477 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.955488 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.955509 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:42 crc kubenswrapper[4574]: I1004 04:47:42.955519 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:42Z","lastTransitionTime":"2025-10-04T04:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.057753 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.057994 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.058157 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.058293 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.058385 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.169480 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.169827 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.169902 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.170021 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.170100 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.272390 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.272429 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.272442 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.272458 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.272470 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.375448 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.375489 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.375500 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.375516 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.375529 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.477980 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.478833 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.478920 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.479000 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.479071 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.582109 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.582415 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.582523 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.582623 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.582703 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.684766 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.685014 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.685089 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.685162 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.685227 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.732528 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.732562 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.732569 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:43 crc kubenswrapper[4574]: E1004 04:47:43.732660 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:43 crc kubenswrapper[4574]: E1004 04:47:43.732769 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:43 crc kubenswrapper[4574]: E1004 04:47:43.732841 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.733087 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:43 crc kubenswrapper[4574]: E1004 04:47:43.733300 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.787852 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.787894 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.787902 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.787916 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.787927 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.895123 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.895426 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.895520 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.895614 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.895695 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.998752 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.999071 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.999159 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.999271 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:43 crc kubenswrapper[4574]: I1004 04:47:43.999357 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:43Z","lastTransitionTime":"2025-10-04T04:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.102365 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.102401 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.102413 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.102430 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.102441 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.204715 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.204742 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.204750 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.204763 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.204775 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.306730 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.306771 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.306782 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.306798 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.306811 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.408865 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.408892 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.408904 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.408916 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.408935 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.511558 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.511597 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.511608 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.511622 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.511632 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.618117 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.618186 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.618221 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.618268 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.618288 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.720736 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.720804 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.720824 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.720848 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.720860 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.749351 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.749333071 podStartE2EDuration="6.749333071s" podCreationTimestamp="2025-10-04 04:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.749008611 +0000 UTC m=+90.603151653" watchObservedRunningTime="2025-10-04 04:47:44.749333071 +0000 UTC m=+90.603476113" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.820085 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.820067912 podStartE2EDuration="1m11.820067912s" podCreationTimestamp="2025-10-04 04:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.820007281 +0000 UTC m=+90.674150323" watchObservedRunningTime="2025-10-04 04:47:44.820067912 +0000 UTC m=+90.674210954" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.820224 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podStartSLOduration=67.820218117 podStartE2EDuration="1m7.820218117s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.799302581 +0000 UTC m=+90.653445623" watchObservedRunningTime="2025-10-04 04:47:44.820218117 +0000 UTC m=+90.674361159" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.822599 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc 
kubenswrapper[4574]: I1004 04:47:44.822631 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.822640 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.822652 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.822664 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.834073 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.834055404 podStartE2EDuration="1m11.834055404s" podCreationTimestamp="2025-10-04 04:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.833430966 +0000 UTC m=+90.687574018" watchObservedRunningTime="2025-10-04 04:47:44.834055404 +0000 UTC m=+90.688198446" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.884366 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6wsfn" podStartSLOduration=67.884352284 podStartE2EDuration="1m7.884352284s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-04 04:47:44.861033578 +0000 UTC m=+90.715176620" watchObservedRunningTime="2025-10-04 04:47:44.884352284 +0000 UTC m=+90.738495326" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.896047 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.896028768 podStartE2EDuration="1m11.896028768s" podCreationTimestamp="2025-10-04 04:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.884711425 +0000 UTC m=+90.738854467" watchObservedRunningTime="2025-10-04 04:47:44.896028768 +0000 UTC m=+90.750171810" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.919703 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6mcbn" podStartSLOduration=67.919684514 podStartE2EDuration="1m7.919684514s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.906944559 +0000 UTC m=+90.761087611" watchObservedRunningTime="2025-10-04 04:47:44.919684514 +0000 UTC m=+90.773827556" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.924618 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.924659 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.924670 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.924685 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:44 crc 
kubenswrapper[4574]: I1004 04:47:44.924694 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:44Z","lastTransitionTime":"2025-10-04T04:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.932104 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gs8xn" podStartSLOduration=67.932086389 podStartE2EDuration="1m7.932086389s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.92296559 +0000 UTC m=+90.777108642" watchObservedRunningTime="2025-10-04 04:47:44.932086389 +0000 UTC m=+90.786229431" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.959918 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=33.959901447 podStartE2EDuration="33.959901447s" podCreationTimestamp="2025-10-04 04:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.944817523 +0000 UTC m=+90.798960565" watchObservedRunningTime="2025-10-04 04:47:44.959901447 +0000 UTC m=+90.814044499" Oct 04 04:47:44 crc kubenswrapper[4574]: I1004 04:47:44.987719 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dmzfp" podStartSLOduration=67.987701465 podStartE2EDuration="1m7.987701465s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.972815107 +0000 UTC m=+90.826958159" watchObservedRunningTime="2025-10-04 04:47:44.987701465 +0000 UTC m=+90.841844507" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.010978 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b9dlv" podStartSLOduration=68.01096328 podStartE2EDuration="1m8.01096328s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:44.98820063 +0000 UTC m=+90.842343692" watchObservedRunningTime="2025-10-04 04:47:45.01096328 +0000 UTC m=+90.865106322" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.027156 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.027208 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.027329 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.027344 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.027352 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.129589 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.129622 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.129631 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.129644 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.129653 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.231758 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.231810 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.231823 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.231839 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.231851 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.334653 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.334687 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.334697 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.334713 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.334723 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.436955 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.437182 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.437296 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.437427 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.437504 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.540221 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.540280 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.540291 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.540305 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.540314 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.642556 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.642603 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.642622 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.642645 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.642663 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.732938 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.732944 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:45 crc kubenswrapper[4574]: E1004 04:47:45.733131 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.732958 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.732950 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:45 crc kubenswrapper[4574]: E1004 04:47:45.733189 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:45 crc kubenswrapper[4574]: E1004 04:47:45.733052 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:45 crc kubenswrapper[4574]: E1004 04:47:45.733222 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.744932 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.745103 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.745187 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.745290 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.745383 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.847061 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.847095 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.847103 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.847116 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.847124 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.949828 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.950110 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.950219 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.950339 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:45 crc kubenswrapper[4574]: I1004 04:47:45.950492 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:45Z","lastTransitionTime":"2025-10-04T04:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.052987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.053276 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.053371 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.053466 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.053542 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.155707 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.155750 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.155758 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.155777 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.155786 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.258180 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.258212 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.258221 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.258250 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.258262 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.360144 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.360197 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.360205 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.360380 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.360391 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.462421 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.462446 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.462453 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.462467 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.462475 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.565329 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.565372 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.565385 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.565400 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.565409 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.667804 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.667851 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.667868 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.667882 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.667894 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.735019 4574 scope.go:117] "RemoveContainer" containerID="ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5" Oct 04 04:47:46 crc kubenswrapper[4574]: E1004 04:47:46.735160 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.769677 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.769703 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.769711 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.769722 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.769731 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.872163 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.872204 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.872215 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.872335 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.872347 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.975045 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.975092 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.975110 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.975133 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:46 crc kubenswrapper[4574]: I1004 04:47:46.975155 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:46Z","lastTransitionTime":"2025-10-04T04:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.077888 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.077932 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.077945 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.077960 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.077976 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.181066 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.181121 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.181132 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.181151 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.181528 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.284310 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.284343 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.284354 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.284370 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.284379 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.386746 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.386801 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.386815 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.386831 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.386843 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.489657 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.489701 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.489710 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.489723 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.489736 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.592389 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.592428 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.592439 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.592453 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.592464 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.694893 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.694926 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.694935 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.694949 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.694958 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.732168 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.732197 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.732188 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:47 crc kubenswrapper[4574]: E1004 04:47:47.732310 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.732166 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:47 crc kubenswrapper[4574]: E1004 04:47:47.732454 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:47 crc kubenswrapper[4574]: E1004 04:47:47.732482 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:47 crc kubenswrapper[4574]: E1004 04:47:47.732543 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.797503 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.797817 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.797829 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.797844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.797855 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.900744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.900788 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.900798 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.900816 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:47 crc kubenswrapper[4574]: I1004 04:47:47.900829 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:47Z","lastTransitionTime":"2025-10-04T04:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.003981 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.004029 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.004043 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.004061 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.004073 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.107397 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.107443 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.107454 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.107475 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.107489 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.210842 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.210896 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.210911 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.210931 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.210944 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.314044 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.314092 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.314102 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.314121 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.314135 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.417049 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.417104 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.417117 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.417141 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.417162 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.519724 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.519800 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.519816 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.519844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.519861 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.623252 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.623307 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.623317 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.623333 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.623346 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.725161 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.725200 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.725209 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.725224 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.725248 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.827742 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.827787 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.827797 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.827856 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.827865 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.930328 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.930438 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.930449 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.930462 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:48 crc kubenswrapper[4574]: I1004 04:47:48.930471 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:48Z","lastTransitionTime":"2025-10-04T04:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.032853 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.032889 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.032898 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.032911 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.032920 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.135403 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.135455 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.135465 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.135481 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.135491 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.238354 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.238391 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.238402 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.238417 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.238427 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.340837 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.340936 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.340961 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.341027 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.341042 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.443476 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.443535 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.443549 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.443566 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.443578 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.546383 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.546450 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.546478 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.546493 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.546503 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.648813 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.648845 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.648855 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.648869 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.648878 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.732280 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.732314 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.732338 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.732280 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:49 crc kubenswrapper[4574]: E1004 04:47:49.732476 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:49 crc kubenswrapper[4574]: E1004 04:47:49.732578 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:49 crc kubenswrapper[4574]: E1004 04:47:49.732852 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:49 crc kubenswrapper[4574]: E1004 04:47:49.732973 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.751701 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.751747 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.751756 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.751775 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.751791 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.854204 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.854261 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.854272 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.854287 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.854300 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.957755 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.957815 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.957832 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.957857 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:49 crc kubenswrapper[4574]: I1004 04:47:49.957879 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:49Z","lastTransitionTime":"2025-10-04T04:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.061015 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.061078 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.061091 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.061117 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.061136 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.163966 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.164007 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.164018 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.164035 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.164045 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.266697 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.266744 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.266757 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.266777 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.266788 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.368679 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.368741 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.368751 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.368767 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.368776 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.471863 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.471973 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.471987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.472008 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.472021 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.579162 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.579242 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.579256 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.579277 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.579299 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.617809 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.619275 4574 scope.go:117] "RemoveContainer" containerID="ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5" Oct 04 04:47:50 crc kubenswrapper[4574]: E1004 04:47:50.619477 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.682028 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.682067 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.682076 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.682090 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.682101 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.785621 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.785720 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.785737 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.785763 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.785779 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.888920 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.888987 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.889004 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.889031 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.889047 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.992112 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.992157 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.992169 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.992185 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:50 crc kubenswrapper[4574]: I1004 04:47:50.992195 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:50Z","lastTransitionTime":"2025-10-04T04:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.094735 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.094775 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.094784 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.094798 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.094808 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:51Z","lastTransitionTime":"2025-10-04T04:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.199193 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.199489 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.199639 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.199791 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.199950 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:51Z","lastTransitionTime":"2025-10-04T04:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.303261 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.303376 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.303388 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.303405 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.303417 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:51Z","lastTransitionTime":"2025-10-04T04:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.344372 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.344639 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.344751 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.344844 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.344962 4574 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-04T04:47:51Z","lastTransitionTime":"2025-10-04T04:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.390998 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt"] Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.391401 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.393756 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.393919 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.394104 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.394214 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.499182 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/70333db1-9c17-4c5b-936b-e08cbc1a2860-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.499376 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70333db1-9c17-4c5b-936b-e08cbc1a2860-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.499475 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/70333db1-9c17-4c5b-936b-e08cbc1a2860-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.499526 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70333db1-9c17-4c5b-936b-e08cbc1a2860-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.499560 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/70333db1-9c17-4c5b-936b-e08cbc1a2860-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.600218 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70333db1-9c17-4c5b-936b-e08cbc1a2860-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.600316 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/70333db1-9c17-4c5b-936b-e08cbc1a2860-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.600343 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/70333db1-9c17-4c5b-936b-e08cbc1a2860-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.600373 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70333db1-9c17-4c5b-936b-e08cbc1a2860-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.600427 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70333db1-9c17-4c5b-936b-e08cbc1a2860-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.600892 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/70333db1-9c17-4c5b-936b-e08cbc1a2860-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.600942 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/70333db1-9c17-4c5b-936b-e08cbc1a2860-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.601335 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/70333db1-9c17-4c5b-936b-e08cbc1a2860-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.608767 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70333db1-9c17-4c5b-936b-e08cbc1a2860-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.620711 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70333db1-9c17-4c5b-936b-e08cbc1a2860-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cxlqt\" (UID: \"70333db1-9c17-4c5b-936b-e08cbc1a2860\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.708582 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.732468 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.732504 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:51 crc kubenswrapper[4574]: E1004 04:47:51.732577 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.732468 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:51 crc kubenswrapper[4574]: E1004 04:47:51.732655 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:51 crc kubenswrapper[4574]: I1004 04:47:51.732740 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:51 crc kubenswrapper[4574]: E1004 04:47:51.732741 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:51 crc kubenswrapper[4574]: E1004 04:47:51.732921 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:52 crc kubenswrapper[4574]: I1004 04:47:52.161282 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" event={"ID":"70333db1-9c17-4c5b-936b-e08cbc1a2860","Type":"ContainerStarted","Data":"aaef35f81ec1b1723daeba5a5ddd3af32858ac0173f751573b33f061abcec71b"} Oct 04 04:47:52 crc kubenswrapper[4574]: I1004 04:47:52.161376 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" event={"ID":"70333db1-9c17-4c5b-936b-e08cbc1a2860","Type":"ContainerStarted","Data":"b443010e9793b08edcfa599746dfea831355b49d6ffa8b94f730b8054ae2120a"} Oct 04 04:47:53 crc kubenswrapper[4574]: I1004 04:47:53.732950 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:53 crc kubenswrapper[4574]: I1004 04:47:53.734468 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:53 crc kubenswrapper[4574]: I1004 04:47:53.734556 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:53 crc kubenswrapper[4574]: E1004 04:47:53.734868 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:53 crc kubenswrapper[4574]: I1004 04:47:53.735341 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:53 crc kubenswrapper[4574]: E1004 04:47:53.735505 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:53 crc kubenswrapper[4574]: E1004 04:47:53.735589 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:53 crc kubenswrapper[4574]: E1004 04:47:53.736041 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:55 crc kubenswrapper[4574]: I1004 04:47:55.543436 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:55 crc kubenswrapper[4574]: E1004 04:47:55.543709 4574 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:47:55 crc kubenswrapper[4574]: E1004 04:47:55.544542 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs podName:833018b5-b584-4e77-a95f-fe56f6dd5945 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:59.544517554 +0000 UTC m=+165.398660586 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs") pod "network-metrics-daemon-stmq5" (UID: "833018b5-b584-4e77-a95f-fe56f6dd5945") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 04 04:47:55 crc kubenswrapper[4574]: I1004 04:47:55.732649 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:55 crc kubenswrapper[4574]: I1004 04:47:55.732740 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:55 crc kubenswrapper[4574]: I1004 04:47:55.732649 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:55 crc kubenswrapper[4574]: I1004 04:47:55.732676 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:55 crc kubenswrapper[4574]: E1004 04:47:55.732831 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:55 crc kubenswrapper[4574]: E1004 04:47:55.732885 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:55 crc kubenswrapper[4574]: E1004 04:47:55.732969 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:55 crc kubenswrapper[4574]: E1004 04:47:55.733037 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:57 crc kubenswrapper[4574]: I1004 04:47:57.733120 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:57 crc kubenswrapper[4574]: E1004 04:47:57.733569 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:57 crc kubenswrapper[4574]: I1004 04:47:57.733136 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:57 crc kubenswrapper[4574]: E1004 04:47:57.733663 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:57 crc kubenswrapper[4574]: I1004 04:47:57.733267 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:57 crc kubenswrapper[4574]: I1004 04:47:57.733125 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:57 crc kubenswrapper[4574]: E1004 04:47:57.733884 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:47:57 crc kubenswrapper[4574]: E1004 04:47:57.733916 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:59 crc kubenswrapper[4574]: I1004 04:47:59.746106 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:47:59 crc kubenswrapper[4574]: E1004 04:47:59.746298 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:47:59 crc kubenswrapper[4574]: I1004 04:47:59.746485 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:47:59 crc kubenswrapper[4574]: I1004 04:47:59.746495 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:47:59 crc kubenswrapper[4574]: I1004 04:47:59.746523 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:47:59 crc kubenswrapper[4574]: E1004 04:47:59.746576 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:47:59 crc kubenswrapper[4574]: E1004 04:47:59.746709 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:47:59 crc kubenswrapper[4574]: E1004 04:47:59.746756 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:01 crc kubenswrapper[4574]: I1004 04:48:01.732681 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:01 crc kubenswrapper[4574]: I1004 04:48:01.732683 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:01 crc kubenswrapper[4574]: I1004 04:48:01.732803 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:01 crc kubenswrapper[4574]: I1004 04:48:01.733467 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:01 crc kubenswrapper[4574]: E1004 04:48:01.733735 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:01 crc kubenswrapper[4574]: E1004 04:48:01.734148 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:01 crc kubenswrapper[4574]: E1004 04:48:01.734223 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:01 crc kubenswrapper[4574]: E1004 04:48:01.734072 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:03 crc kubenswrapper[4574]: I1004 04:48:03.732262 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:03 crc kubenswrapper[4574]: I1004 04:48:03.732262 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:03 crc kubenswrapper[4574]: E1004 04:48:03.732398 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:03 crc kubenswrapper[4574]: I1004 04:48:03.732409 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:03 crc kubenswrapper[4574]: E1004 04:48:03.732464 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:03 crc kubenswrapper[4574]: I1004 04:48:03.732480 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:03 crc kubenswrapper[4574]: E1004 04:48:03.732534 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:03 crc kubenswrapper[4574]: E1004 04:48:03.732611 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:05 crc kubenswrapper[4574]: I1004 04:48:05.732960 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:05 crc kubenswrapper[4574]: I1004 04:48:05.732985 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:05 crc kubenswrapper[4574]: I1004 04:48:05.733006 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:05 crc kubenswrapper[4574]: E1004 04:48:05.733064 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:05 crc kubenswrapper[4574]: I1004 04:48:05.732973 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:05 crc kubenswrapper[4574]: E1004 04:48:05.733172 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:05 crc kubenswrapper[4574]: E1004 04:48:05.733440 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:05 crc kubenswrapper[4574]: E1004 04:48:05.733545 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:05 crc kubenswrapper[4574]: I1004 04:48:05.733748 4574 scope.go:117] "RemoveContainer" containerID="ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5" Oct 04 04:48:05 crc kubenswrapper[4574]: E1004 04:48:05.733890 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ntdng_openshift-ovn-kubernetes(e473790c-4fad-4637-9d72-0dd6310b4ae0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" Oct 04 04:48:07 crc kubenswrapper[4574]: I1004 04:48:07.732983 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:07 crc kubenswrapper[4574]: I1004 04:48:07.733088 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:07 crc kubenswrapper[4574]: I1004 04:48:07.732998 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:07 crc kubenswrapper[4574]: I1004 04:48:07.732986 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:07 crc kubenswrapper[4574]: E1004 04:48:07.733171 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:07 crc kubenswrapper[4574]: E1004 04:48:07.733264 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:07 crc kubenswrapper[4574]: E1004 04:48:07.733369 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:07 crc kubenswrapper[4574]: E1004 04:48:07.733477 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:09 crc kubenswrapper[4574]: I1004 04:48:09.732148 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:09 crc kubenswrapper[4574]: I1004 04:48:09.732148 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:09 crc kubenswrapper[4574]: I1004 04:48:09.732148 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:09 crc kubenswrapper[4574]: E1004 04:48:09.732477 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:09 crc kubenswrapper[4574]: E1004 04:48:09.732290 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:09 crc kubenswrapper[4574]: E1004 04:48:09.732489 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:09 crc kubenswrapper[4574]: I1004 04:48:09.732162 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:09 crc kubenswrapper[4574]: E1004 04:48:09.732594 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:11 crc kubenswrapper[4574]: I1004 04:48:11.732174 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:11 crc kubenswrapper[4574]: I1004 04:48:11.732221 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:11 crc kubenswrapper[4574]: I1004 04:48:11.732284 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:11 crc kubenswrapper[4574]: I1004 04:48:11.732225 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:11 crc kubenswrapper[4574]: E1004 04:48:11.732366 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:11 crc kubenswrapper[4574]: E1004 04:48:11.732441 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:11 crc kubenswrapper[4574]: E1004 04:48:11.732535 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:11 crc kubenswrapper[4574]: E1004 04:48:11.732613 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.221166 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/1.log" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.222040 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/0.log" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.222170 4574 generic.go:334] "Generic (PLEG): container finished" podID="649982aa-c9c5-41ce-a056-48ad058e9aa5" containerID="231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707" exitCode=1 Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.222268 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerDied","Data":"231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707"} Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.222421 4574 scope.go:117] "RemoveContainer" containerID="c030f20450cb890f1de0863ae0497e515723144055a797306f503b40d1701e9b" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.222787 4574 scope.go:117] "RemoveContainer" containerID="231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707" Oct 04 04:48:13 crc kubenswrapper[4574]: E1004 04:48:13.222946 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6wsfn_openshift-multus(649982aa-c9c5-41ce-a056-48ad058e9aa5)\"" pod="openshift-multus/multus-6wsfn" podUID="649982aa-c9c5-41ce-a056-48ad058e9aa5" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.241220 4574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cxlqt" podStartSLOduration=96.241201616 podStartE2EDuration="1m36.241201616s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:47:52.184742192 +0000 UTC m=+98.038885234" watchObservedRunningTime="2025-10-04 04:48:13.241201616 +0000 UTC m=+119.095344658" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.732314 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:13 crc kubenswrapper[4574]: E1004 04:48:13.732448 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.732315 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.732316 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:13 crc kubenswrapper[4574]: E1004 04:48:13.732525 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:13 crc kubenswrapper[4574]: I1004 04:48:13.732334 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:13 crc kubenswrapper[4574]: E1004 04:48:13.732588 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:13 crc kubenswrapper[4574]: E1004 04:48:13.732643 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:14 crc kubenswrapper[4574]: I1004 04:48:14.226436 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/1.log" Oct 04 04:48:14 crc kubenswrapper[4574]: E1004 04:48:14.738892 4574 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 04 04:48:14 crc kubenswrapper[4574]: E1004 04:48:14.806090 4574 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 04 04:48:15 crc kubenswrapper[4574]: I1004 04:48:15.732888 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:15 crc kubenswrapper[4574]: I1004 04:48:15.732943 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:15 crc kubenswrapper[4574]: E1004 04:48:15.733007 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:15 crc kubenswrapper[4574]: I1004 04:48:15.732942 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:15 crc kubenswrapper[4574]: E1004 04:48:15.733077 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:15 crc kubenswrapper[4574]: E1004 04:48:15.733155 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:15 crc kubenswrapper[4574]: I1004 04:48:15.733913 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:15 crc kubenswrapper[4574]: E1004 04:48:15.734047 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:17 crc kubenswrapper[4574]: I1004 04:48:17.733107 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:17 crc kubenswrapper[4574]: I1004 04:48:17.733178 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:17 crc kubenswrapper[4574]: E1004 04:48:17.733319 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:17 crc kubenswrapper[4574]: I1004 04:48:17.733406 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:17 crc kubenswrapper[4574]: I1004 04:48:17.733454 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:17 crc kubenswrapper[4574]: E1004 04:48:17.733458 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:17 crc kubenswrapper[4574]: E1004 04:48:17.733666 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:17 crc kubenswrapper[4574]: E1004 04:48:17.733753 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:17 crc kubenswrapper[4574]: I1004 04:48:17.734430 4574 scope.go:117] "RemoveContainer" containerID="ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5" Oct 04 04:48:18 crc kubenswrapper[4574]: I1004 04:48:18.239586 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/3.log" Oct 04 04:48:18 crc kubenswrapper[4574]: I1004 04:48:18.243642 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerStarted","Data":"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112"} Oct 04 04:48:18 crc kubenswrapper[4574]: I1004 04:48:18.244085 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:48:18 crc kubenswrapper[4574]: I1004 04:48:18.272990 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podStartSLOduration=101.272962898 podStartE2EDuration="1m41.272962898s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:18.271120724 +0000 UTC m=+124.125263776" watchObservedRunningTime="2025-10-04 04:48:18.272962898 +0000 UTC m=+124.127105940" Oct 04 04:48:18 crc kubenswrapper[4574]: I1004 04:48:18.603101 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-stmq5"] Oct 04 04:48:18 crc kubenswrapper[4574]: I1004 04:48:18.603231 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:18 crc kubenswrapper[4574]: E1004 04:48:18.603344 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:19 crc kubenswrapper[4574]: I1004 04:48:19.733033 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:19 crc kubenswrapper[4574]: I1004 04:48:19.733072 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:19 crc kubenswrapper[4574]: I1004 04:48:19.733141 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:19 crc kubenswrapper[4574]: E1004 04:48:19.733265 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:19 crc kubenswrapper[4574]: E1004 04:48:19.733362 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:19 crc kubenswrapper[4574]: E1004 04:48:19.733467 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:19 crc kubenswrapper[4574]: E1004 04:48:19.807938 4574 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 04 04:48:20 crc kubenswrapper[4574]: I1004 04:48:20.733419 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:20 crc kubenswrapper[4574]: E1004 04:48:20.733631 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:21 crc kubenswrapper[4574]: I1004 04:48:21.732818 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:21 crc kubenswrapper[4574]: I1004 04:48:21.732868 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:21 crc kubenswrapper[4574]: I1004 04:48:21.732872 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:21 crc kubenswrapper[4574]: E1004 04:48:21.732993 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:21 crc kubenswrapper[4574]: E1004 04:48:21.733087 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:21 crc kubenswrapper[4574]: E1004 04:48:21.733169 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:22 crc kubenswrapper[4574]: I1004 04:48:22.732858 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:22 crc kubenswrapper[4574]: E1004 04:48:22.733368 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:23 crc kubenswrapper[4574]: I1004 04:48:23.732246 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:23 crc kubenswrapper[4574]: I1004 04:48:23.732249 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:23 crc kubenswrapper[4574]: I1004 04:48:23.732342 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:23 crc kubenswrapper[4574]: E1004 04:48:23.732449 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:23 crc kubenswrapper[4574]: E1004 04:48:23.732592 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:23 crc kubenswrapper[4574]: E1004 04:48:23.732648 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:24 crc kubenswrapper[4574]: I1004 04:48:24.735600 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:24 crc kubenswrapper[4574]: E1004 04:48:24.736716 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:24 crc kubenswrapper[4574]: E1004 04:48:24.808402 4574 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 04 04:48:25 crc kubenswrapper[4574]: I1004 04:48:25.732347 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:25 crc kubenswrapper[4574]: I1004 04:48:25.732348 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:25 crc kubenswrapper[4574]: E1004 04:48:25.732595 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:25 crc kubenswrapper[4574]: I1004 04:48:25.732378 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:25 crc kubenswrapper[4574]: E1004 04:48:25.732797 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:25 crc kubenswrapper[4574]: E1004 04:48:25.732911 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:26 crc kubenswrapper[4574]: I1004 04:48:26.733200 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:26 crc kubenswrapper[4574]: E1004 04:48:26.733376 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:27 crc kubenswrapper[4574]: I1004 04:48:27.732313 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:27 crc kubenswrapper[4574]: I1004 04:48:27.732351 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:27 crc kubenswrapper[4574]: I1004 04:48:27.732430 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:27 crc kubenswrapper[4574]: E1004 04:48:27.732933 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:27 crc kubenswrapper[4574]: I1004 04:48:27.732967 4574 scope.go:117] "RemoveContainer" containerID="231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707" Oct 04 04:48:27 crc kubenswrapper[4574]: E1004 04:48:27.733049 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:27 crc kubenswrapper[4574]: E1004 04:48:27.733116 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:28 crc kubenswrapper[4574]: I1004 04:48:28.278195 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/1.log" Oct 04 04:48:28 crc kubenswrapper[4574]: I1004 04:48:28.278281 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerStarted","Data":"71122238bbcb5eb7ac6d2b213c66c02622792750a759fbf70a6697d445b3535f"} Oct 04 04:48:28 crc kubenswrapper[4574]: I1004 04:48:28.735070 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:28 crc kubenswrapper[4574]: E1004 04:48:28.735197 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-stmq5" podUID="833018b5-b584-4e77-a95f-fe56f6dd5945" Oct 04 04:48:29 crc kubenswrapper[4574]: I1004 04:48:29.732819 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:29 crc kubenswrapper[4574]: I1004 04:48:29.732892 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:29 crc kubenswrapper[4574]: E1004 04:48:29.732943 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 04 04:48:29 crc kubenswrapper[4574]: E1004 04:48:29.733027 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 04 04:48:29 crc kubenswrapper[4574]: I1004 04:48:29.734156 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:29 crc kubenswrapper[4574]: E1004 04:48:29.734479 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 04 04:48:30 crc kubenswrapper[4574]: I1004 04:48:30.733292 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:30 crc kubenswrapper[4574]: I1004 04:48:30.736462 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 04 04:48:30 crc kubenswrapper[4574]: I1004 04:48:30.737939 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 04 04:48:31 crc kubenswrapper[4574]: I1004 04:48:31.732150 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:31 crc kubenswrapper[4574]: I1004 04:48:31.732158 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:31 crc kubenswrapper[4574]: I1004 04:48:31.732150 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:31 crc kubenswrapper[4574]: I1004 04:48:31.733809 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 04 04:48:31 crc kubenswrapper[4574]: I1004 04:48:31.734403 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 04 04:48:31 crc kubenswrapper[4574]: I1004 04:48:31.734677 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 04 04:48:31 crc kubenswrapper[4574]: I1004 04:48:31.734787 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.249749 4574 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.297087 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ngf8v"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.297456 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.299747 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k52jj"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.300119 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.300491 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.301029 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.305264 4574 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.305308 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.305385 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.306948 4574 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 
'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.306977 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.307022 4574 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.307033 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.307068 4574 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.307081 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets 
\"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.307088 4574 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.307108 4574 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.307122 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.307133 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 
04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.308841 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.309435 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.309781 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ms6sm"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.311971 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.312029 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.312870 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.313049 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qbwcp"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.313524 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.330474 4574 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.330516 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.330609 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.332085 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.332947 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.333010 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.333376 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.333453 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.333589 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.333868 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.333903 4574 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.333924 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.334062 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.334176 4574 reflector.go:561] object-"openshift-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps 
"openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.334198 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.334371 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.334408 4574 reflector.go:561] object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z": failed to list *v1.Secret: secrets "openshift-config-operator-dockercfg-7pc5z" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.334455 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-7pc5z\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-config-operator-dockercfg-7pc5z\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.334478 4574 reflector.go:561] 
object-"openshift-config-operator"/"config-operator-serving-cert": failed to list *v1.Secret: secrets "config-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.334503 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"config-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"config-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.334518 4574 reflector.go:561] object-"openshift-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.334538 4574 reflector.go:561] object-"openshift-console-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.334551 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the 
namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.334536 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.334584 4574 reflector.go:561] object-"openshift-console-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.334594 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.334613 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.334634 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.334751 4574 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 04 04:48:32 crc kubenswrapper[4574]: W1004 04:48:32.334828 4574 reflector.go:561] object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr": failed to list *v1.Secret: secrets "console-operator-dockercfg-4xjcr" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.334849 4574 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4xjcr\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-operator-dockercfg-4xjcr\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.337178 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.337620 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.338068 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.338712 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.339011 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.339125 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.339670 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.343350 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-l8x2m"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.343502 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.350849 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-44hzk"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.351651 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.352067 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlz45\" (UniqueName: \"kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.352160 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pqf\" (UniqueName: \"kubernetes.io/projected/699add67-bf01-4799-80ff-615e4ea6da01-kube-api-access-m7pqf\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.352289 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a5a14-3672-401d-8cbe-df1c5b4081be-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.352400 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/699add67-bf01-4799-80ff-615e4ea6da01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 
04:48:32.352547 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rb475"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.352615 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.352691 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkfj\" (UniqueName: \"kubernetes.io/projected/d6a6950f-165f-4419-8f44-4e65c42c51b4-kube-api-access-tvkfj\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.352795 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4f8t\" (UniqueName: \"kubernetes.io/projected/676a5a14-3672-401d-8cbe-df1c5b4081be-kube-api-access-j4f8t\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.367530 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.368211 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.368832 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hkp92"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.369256 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.369733 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.370012 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.370280 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.371662 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.372079 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.372184 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.372106 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pcvkf"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.372895 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.372948 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.372967 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a6950f-165f-4419-8f44-4e65c42c51b4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.372985 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6a6950f-165f-4419-8f44-4e65c42c51b4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373014 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a5a14-3672-401d-8cbe-df1c5b4081be-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373018 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373031 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373049 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-config\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373064 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-serving-cert\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373099 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g699f\" (UniqueName: \"kubernetes.io/projected/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-kube-api-access-g699f\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373119 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a6950f-165f-4419-8f44-4e65c42c51b4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373140 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373365 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373814 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.373898 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.374549 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2nmbr"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.374942 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.380651 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wwzd7"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.381098 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.382985 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.397607 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.408913 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.409195 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.410000 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.411642 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.412342 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.412599 4574 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.412708 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.413114 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j6jgh"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.413264 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.413670 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.413771 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.413897 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.413968 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.414352 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.415889 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.416550 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.417223 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.417674 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.417796 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.418748 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.418932 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.418953 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.419051 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.419055 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.419319 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.421481 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ms27n"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.422195 4574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.424067 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.424483 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.432938 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.432962 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.433253 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.433509 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.433872 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434107 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434288 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434363 4574 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434394 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.433186 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434684 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434711 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434638 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.434967 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.435076 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.435144 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.435214 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.435324 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 
04:48:32.435285 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.436094 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.436256 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.455498 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.456293 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.456629 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.457079 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.457440 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.457785 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.460158 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.460294 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.460563 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.460712 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.460829 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.460945 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.461055 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.461177 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.462615 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.470358 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.470485 4574 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.470552 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.470610 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.470663 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.470710 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.471178 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.471808 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.471928 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.472029 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.474353 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a6950f-165f-4419-8f44-4e65c42c51b4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.474535 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6a6950f-165f-4419-8f44-4e65c42c51b4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.474645 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a5a14-3672-401d-8cbe-df1c5b4081be-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.474760 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.474846 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-config\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.474981 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-serving-cert\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.475092 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g699f\" (UniqueName: \"kubernetes.io/projected/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-kube-api-access-g699f\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.475226 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a6950f-165f-4419-8f44-4e65c42c51b4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.475441 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.475558 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlz45\" (UniqueName: \"kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.475670 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pqf\" (UniqueName: \"kubernetes.io/projected/699add67-bf01-4799-80ff-615e4ea6da01-kube-api-access-m7pqf\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.475772 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a5a14-3672-401d-8cbe-df1c5b4081be-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.475897 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/699add67-bf01-4799-80ff-615e4ea6da01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476015 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476131 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tvkfj\" (UniqueName: \"kubernetes.io/projected/d6a6950f-165f-4419-8f44-4e65c42c51b4-kube-api-access-tvkfj\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476258 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4f8t\" (UniqueName: \"kubernetes.io/projected/676a5a14-3672-401d-8cbe-df1c5b4081be-kube-api-access-j4f8t\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476384 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476480 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476589 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca\") pod \"controller-manager-879f6c89f-k52jj\" (UID: 
\"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476685 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.476956 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.478811 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.479538 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.482870 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.483628 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/699add67-bf01-4799-80ff-615e4ea6da01-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.483906 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a5a14-3672-401d-8cbe-df1c5b4081be-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.484364 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-config\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.484551 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.484803 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.484945 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.485037 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.488786 4574 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.493105 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tn7qm"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.493534 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.493835 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.494076 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.496578 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a5a14-3672-401d-8cbe-df1c5b4081be-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.497016 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6a6950f-165f-4419-8f44-4e65c42c51b4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.500899 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.501012 4574 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.501826 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.502211 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.502664 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-serving-cert\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.502971 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.503163 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7sktt"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.503867 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.504033 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6a6950f-165f-4419-8f44-4e65c42c51b4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.506487 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.508887 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wpz6v"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.509616 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.509836 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.512017 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.512553 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.513062 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.513637 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.513877 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.514039 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.522848 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hwfs9"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.524025 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.528732 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.529401 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.528683 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.529786 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ngf8v"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.529848 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.529853 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.541856 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.543259 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.544680 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.546808 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k52jj"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.551684 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.552411 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qbwcp"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.558722 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.565332 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.568398 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.577860 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.579732 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.581461 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hkp92"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.584398 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kv6r9"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.586766 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.590160 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-l8x2m"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.590297 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-44hzk"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.592273 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j6jgh"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.593261 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.594948 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wpz6v"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.595819 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7sktt"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.597077 4574 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.598896 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.599621 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.601339 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.603458 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.605761 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ms6sm"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.607548 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hwfs9"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.609394 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.610936 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pcvkf"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.612588 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.613735 4574 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-2qqsp"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.614779 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2qqsp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.616030 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.619627 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tn7qm"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.621227 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wwzd7"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.621801 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.623321 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zxrvh"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.624565 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.625496 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.627087 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.628996 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2nmbr"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.630190 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2qqsp"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.632306 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.633528 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kv6r9"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.634649 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.635915 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.637317 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.638214 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zxrvh"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.639543 4574 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.639703 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.641348 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-g2zvs"] Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.641919 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.659747 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.680308 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.700187 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.739903 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.761124 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.780604 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781399 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-serving-cert\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781466 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-config\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781498 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-trusted-ca\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781521 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-certificates\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781542 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-client-ca\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 
04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781564 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rv2\" (UniqueName: \"kubernetes.io/projected/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-kube-api-access-66rv2\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781580 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-tls\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781870 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781905 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8tgg\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-kube-api-access-r8tgg\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781922 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-config\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781945 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-trusted-ca\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781963 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-config\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.781984 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3111436-b5d8-405e-ab14-2fb33bd107c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782034 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050c17bb-6aa3-49bd-a875-c2088ffd1799-config\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782060 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050c17bb-6aa3-49bd-a875-c2088ffd1799-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782079 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782100 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-479vz\" (UniqueName: \"kubernetes.io/projected/f3111436-b5d8-405e-ab14-2fb33bd107c0-kube-api-access-479vz\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782137 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-bound-sa-token\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782154 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782180 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782223 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.782255 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/050c17bb-6aa3-49bd-a875-c2088ffd1799-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.785479 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.28545332 +0000 UTC m=+139.139596352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.800484 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.819279 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.839456 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.860051 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.883453 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.884647 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:33.384596104 +0000 UTC m=+139.238739156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.885800 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4eab433d-51c6-4d3e-8c47-329eb8b06c52-proxy-tls\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886437 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbfst\" (UniqueName: \"kubernetes.io/projected/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-kube-api-access-wbfst\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886471 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/654c3af8-4315-43f4-aedf-366422a88358-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 
04:48:32.886529 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886552 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-audit-dir\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886601 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886623 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31ed34c-4127-4040-91fb-c53b671f9ab5-secret-volume\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886660 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-479vz\" (UniqueName: \"kubernetes.io/projected/f3111436-b5d8-405e-ab14-2fb33bd107c0-kube-api-access-479vz\") pod \"route-controller-manager-6576b87f9c-n25jn\" 
(UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886681 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eadb650-b9f5-4f66-a038-0a381546b35d-serving-cert\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886702 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-signing-cabundle\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886737 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-encryption-config\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886757 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlbms\" (UniqueName: \"kubernetes.io/projected/9b429dea-1750-4927-a2bb-9ca8f00c4083-kube-api-access-vlbms\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886776 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq6kw\" (UniqueName: \"kubernetes.io/projected/69b2231e-4f54-4554-8e7a-d46e644d6b81-kube-api-access-fq6kw\") pod \"downloads-7954f5f757-2nmbr\" (UID: \"69b2231e-4f54-4554-8e7a-d46e644d6b81\") " pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886817 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-default-certificate\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886852 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rzx8\" (UniqueName: \"kubernetes.io/projected/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-kube-api-access-4rzx8\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886873 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886914 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtt5\" (UniqueName: \"kubernetes.io/projected/d5441c46-ed2e-45d3-8cab-6493dd503085-kube-api-access-fvtt5\") pod \"multus-admission-controller-857f4d67dd-wpz6v\" (UID: 
\"d5441c46-ed2e-45d3-8cab-6493dd503085\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886936 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886971 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.886994 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlzqt\" (UniqueName: \"kubernetes.io/projected/78904868-f0f9-4198-ac3a-130af7060c38-kube-api-access-jlzqt\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887021 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-certs\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887044 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887087 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-serving-cert\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887135 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887156 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-csi-data-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887205 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1bc315a4-bf12-48d0-aa24-da64d82a31f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8xr5z\" (UID: 
\"1bc315a4-bf12-48d0-aa24-da64d82a31f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887227 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2b447ff0-4b72-429b-a255-bbd745131936-profile-collector-cert\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887264 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e7eb9d0-b927-442e-be78-72787f67986c-machine-approver-tls\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887284 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d74c2b-550a-43a4-858a-be942ffece17-service-ca-bundle\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887332 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tscb9\" (UniqueName: \"kubernetes.io/projected/35618a3f-3250-4767-892c-06d7cf99e0a9-kube-api-access-tscb9\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887362 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fqm\" (UniqueName: \"kubernetes.io/projected/f9d839f0-e881-471e-aaf6-a948bb298b17-kube-api-access-n9fqm\") pod \"dns-operator-744455d44c-j6jgh\" (UID: \"f9d839f0-e881-471e-aaf6-a948bb298b17\") " pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.887388 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-config\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.888384 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.388363814 +0000 UTC m=+139.242507056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.889771 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2b447ff0-4b72-429b-a255-bbd745131936-kube-api-access-7lq2r\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.889884 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78904868-f0f9-4198-ac3a-130af7060c38-apiservice-cert\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890060 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890120 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xzpjl\" (UniqueName: \"kubernetes.io/projected/e31ed34c-4127-4040-91fb-c53b671f9ab5-kube-api-access-xzpjl\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890155 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e7eb9d0-b927-442e-be78-72787f67986c-auth-proxy-config\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890282 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/050c17bb-6aa3-49bd-a875-c2088ffd1799-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890321 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tscnx\" (UniqueName: \"kubernetes.io/projected/34e83d3a-faaf-4720-85d2-1430c65810fd-kube-api-access-tscnx\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890359 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00c73e-dcd3-4fb7-aedd-77c84ea82855-config\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: 
\"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890397 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcw9f\" (UniqueName: \"kubernetes.io/projected/da00c73e-dcd3-4fb7-aedd-77c84ea82855-kube-api-access-xcw9f\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890427 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890464 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-serving-cert\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890492 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxrtq\" (UniqueName: \"kubernetes.io/projected/87ef4dec-e273-41a2-96de-6c9cc05122d2-kube-api-access-nxrtq\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890518 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890560 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-etcd-client\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890598 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-serving-cert\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890627 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjgf\" (UniqueName: \"kubernetes.io/projected/1a671e58-ffed-46d3-ae24-460febf09dea-kube-api-access-mxjgf\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890634 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890661 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-client-ca\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890726 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rv2\" (UniqueName: \"kubernetes.io/projected/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-kube-api-access-66rv2\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890764 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvn9g\" (UniqueName: \"kubernetes.io/projected/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-kube-api-access-wvn9g\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890789 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-proxy-tls\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890816 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890844 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-image-import-ca\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890909 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da00c73e-dcd3-4fb7-aedd-77c84ea82855-images\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.890937 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-trusted-ca\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891044 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" 
Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891080 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-plugins-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891106 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-signing-key\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891134 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-encryption-config\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891187 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7pbs\" (UniqueName: \"kubernetes.io/projected/4eab433d-51c6-4d3e-8c47-329eb8b06c52-kube-api-access-z7pbs\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891255 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-config\") pod \"console-operator-58897d9998-ms6sm\" (UID: 
\"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891284 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891340 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-service-ca\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891385 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b429dea-1750-4927-a2bb-9ca8f00c4083-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891541 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f312b88c-5c97-446d-9d7b-e717ac2124fb-cert\") pod \"ingress-canary-2qqsp\" (UID: \"f312b88c-5c97-446d-9d7b-e717ac2124fb\") " pod="openshift-ingress-canary/ingress-canary-2qqsp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891591 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/050c17bb-6aa3-49bd-a875-c2088ffd1799-config\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891628 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891660 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plvqb\" (UniqueName: \"kubernetes.io/projected/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-kube-api-access-plvqb\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891691 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-service-ca\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891758 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050c17bb-6aa3-49bd-a875-c2088ffd1799-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891817 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-node-pullsecrets\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.891848 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.892296 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.892557 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d5441c46-ed2e-45d3-8cab-6493dd503085-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wpz6v\" (UID: \"d5441c46-ed2e-45d3-8cab-6493dd503085\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.892824 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4c5\" (UniqueName: 
\"kubernetes.io/projected/c7efed8f-30b4-470a-9ee5-94f38ed51f37-kube-api-access-pt4c5\") pod \"package-server-manager-789f6589d5-xmxmq\" (UID: \"c7efed8f-30b4-470a-9ee5-94f38ed51f37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893043 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-serving-cert\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893166 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b429dea-1750-4927-a2bb-9ca8f00c4083-config\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893202 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893266 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a671e58-ffed-46d3-ae24-460febf09dea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893303 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2xk\" (UniqueName: \"kubernetes.io/projected/29aee87b-0598-4b50-9b1a-beacaf6d7275-kube-api-access-vf2xk\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893328 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/78904868-f0f9-4198-ac3a-130af7060c38-tmpfs\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893348 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4tlw\" (UniqueName: \"kubernetes.io/projected/d9424aaa-698a-43e0-ae1c-614cc4c538a6-kube-api-access-h4tlw\") pod \"control-plane-machine-set-operator-78cbb6b69f-x7jjx\" (UID: \"d9424aaa-698a-43e0-ae1c-614cc4c538a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893372 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050c17bb-6aa3-49bd-a875-c2088ffd1799-config\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893334 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-client-ca\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893382 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78904868-f0f9-4198-ac3a-130af7060c38-webhook-cert\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893436 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-metrics-certs\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893490 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31ed34c-4127-4040-91fb-c53b671f9ab5-config-volume\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893513 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: 
\"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893548 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893690 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-bound-sa-token\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893826 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-client\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893863 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtqxf\" (UniqueName: \"kubernetes.io/projected/1601fa84-c51f-451f-8538-6ee23ed108c1-kube-api-access-vtqxf\") pod \"migrator-59844c95c7-25vps\" (UID: \"1601fa84-c51f-451f-8538-6ee23ed108c1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.893939 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-policies\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894125 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-node-bootstrap-token\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894183 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-images\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894282 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-config\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894339 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtxfv\" (UniqueName: \"kubernetes.io/projected/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-kube-api-access-jtxfv\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " 
pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894379 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-metrics-tls\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894405 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2b447ff0-4b72-429b-a255-bbd745131936-srv-cert\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894467 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-audit-policies\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894492 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-etcd-client\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894544 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4eab433d-51c6-4d3e-8c47-329eb8b06c52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894674 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-oauth-serving-cert\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894775 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-config\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894821 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894867 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5zc\" (UniqueName: \"kubernetes.io/projected/1bc315a4-bf12-48d0-aa24-da64d82a31f3-kube-api-access-fh5zc\") pod \"cluster-samples-operator-665b6dd947-8xr5z\" (UID: \"1bc315a4-bf12-48d0-aa24-da64d82a31f3\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.894902 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcw8\" (UniqueName: \"kubernetes.io/projected/f312b88c-5c97-446d-9d7b-e717ac2124fb-kube-api-access-llcw8\") pod \"ingress-canary-2qqsp\" (UID: \"f312b88c-5c97-446d-9d7b-e717ac2124fb\") " pod="openshift-ingress-canary/ingress-canary-2qqsp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895079 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7eb9d0-b927-442e-be78-72787f67986c-config\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895108 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4l5f\" (UniqueName: \"kubernetes.io/projected/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-kube-api-access-v4l5f\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895131 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895180 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-etcd-serving-ca\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895585 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s59qx\" (UniqueName: \"kubernetes.io/projected/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-kube-api-access-s59qx\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895625 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da00c73e-dcd3-4fb7-aedd-77c84ea82855-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895677 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-srv-cert\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895705 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhb7f\" (UniqueName: \"kubernetes.io/projected/45d7e969-0ef5-4ba5-8259-09dbe9eec354-kube-api-access-hhb7f\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc 
kubenswrapper[4574]: I1004 04:48:32.895739 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-serving-cert\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895830 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b6e9d146-2d36-4313-9f02-2db06b5b5573-metrics-tls\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895854 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqqj6\" (UniqueName: \"kubernetes.io/projected/b6e9d146-2d36-4313-9f02-2db06b5b5573-kube-api-access-pqqj6\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895903 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895924 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45d7e969-0ef5-4ba5-8259-09dbe9eec354-audit-dir\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895942 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-config\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895990 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-config\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.895897 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-config\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896009 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68zpb\" (UniqueName: \"kubernetes.io/projected/5eadb650-b9f5-4f66-a038-0a381546b35d-kube-api-access-68zpb\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896101 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrdp\" (UniqueName: 
\"kubernetes.io/projected/00d74c2b-550a-43a4-858a-be942ffece17-kube-api-access-kdrdp\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896137 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-audit\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896275 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9424aaa-698a-43e0-ae1c-614cc4c538a6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x7jjx\" (UID: \"d9424aaa-698a-43e0-ae1c-614cc4c538a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896317 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-socket-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896358 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-trusted-ca\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 
04:48:32.896430 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896466 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-certificates\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896506 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896526 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a671e58-ffed-46d3-ae24-460febf09dea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896740 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-tls\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896757 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-dir\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896786 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf5tj\" (UniqueName: \"kubernetes.io/projected/2e7eb9d0-b927-442e-be78-72787f67986c-kube-api-access-lf5tj\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896804 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d839f0-e881-471e-aaf6-a948bb298b17-metrics-tls\") pod \"dns-operator-744455d44c-j6jgh\" (UID: \"f9d839f0-e881-471e-aaf6-a948bb298b17\") " pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896826 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-registration-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896843 
4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-ca\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896859 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-trusted-ca-bundle\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896875 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/654c3af8-4315-43f4-aedf-366422a88358-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896891 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7efed8f-30b4-470a-9ee5-94f38ed51f37-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xmxmq\" (UID: \"c7efed8f-30b4-470a-9ee5-94f38ed51f37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896913 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-mountpoint-dir\") 
pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896928 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-stats-auth\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896944 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-oauth-config\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896970 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8tgg\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-kube-api-access-r8tgg\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.896985 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6e9d146-2d36-4313-9f02-2db06b5b5573-config-volume\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.897003 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-trusted-ca\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.897025 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.897087 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3111436-b5d8-405e-ab14-2fb33bd107c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.897109 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/654c3af8-4315-43f4-aedf-366422a88358-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.897137 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-config\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.897304 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050c17bb-6aa3-49bd-a875-c2088ffd1799-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.898318 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-trusted-ca\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.898745 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-config\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.898752 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-trusted-ca\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.898912 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-config\") pod 
\"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.899063 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-certificates\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.899572 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.901074 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.903844 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-tls\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.903871 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.903844 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3111436-b5d8-405e-ab14-2fb33bd107c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.919806 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.940026 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.961370 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.981379 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.998365 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.998880 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: E1004 04:48:32.999006 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.49890145 +0000 UTC m=+139.353044492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.999148 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a671e58-ffed-46d3-ae24-460febf09dea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.999285 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-dir\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.999387 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf5tj\" (UniqueName: \"kubernetes.io/projected/2e7eb9d0-b927-442e-be78-72787f67986c-kube-api-access-lf5tj\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.999463 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d839f0-e881-471e-aaf6-a948bb298b17-metrics-tls\") pod \"dns-operator-744455d44c-j6jgh\" (UID: \"f9d839f0-e881-471e-aaf6-a948bb298b17\") " pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.999541 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-registration-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.999640 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-ca\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:32 crc kubenswrapper[4574]: I1004 04:48:32.999503 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:32.999408 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-dir\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:32.999744 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-trusted-ca-bundle\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000061 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-stats-auth\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000090 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-registration-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000094 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-oauth-config\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000332 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6e9d146-2d36-4313-9f02-2db06b5b5573-config-volume\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000391 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/654c3af8-4315-43f4-aedf-366422a88358-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000421 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7efed8f-30b4-470a-9ee5-94f38ed51f37-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xmxmq\" (UID: \"c7efed8f-30b4-470a-9ee5-94f38ed51f37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000445 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-mountpoint-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000443 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-ca\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000481 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:33 crc 
kubenswrapper[4574]: I1004 04:48:33.000545 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/654c3af8-4315-43f4-aedf-366422a88358-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000569 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4eab433d-51c6-4d3e-8c47-329eb8b06c52-proxy-tls\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000591 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbfst\" (UniqueName: \"kubernetes.io/projected/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-kube-api-access-wbfst\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000624 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-mountpoint-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000637 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/654c3af8-4315-43f4-aedf-366422a88358-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: 
\"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000666 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000690 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-audit-dir\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000719 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31ed34c-4127-4040-91fb-c53b671f9ab5-secret-volume\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000751 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eadb650-b9f5-4f66-a038-0a381546b35d-serving-cert\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000779 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-signing-cabundle\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000812 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-encryption-config\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000833 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlbms\" (UniqueName: \"kubernetes.io/projected/9b429dea-1750-4927-a2bb-9ca8f00c4083-kube-api-access-vlbms\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000860 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-default-certificate\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000883 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq6kw\" (UniqueName: \"kubernetes.io/projected/69b2231e-4f54-4554-8e7a-d46e644d6b81-kube-api-access-fq6kw\") pod \"downloads-7954f5f757-2nmbr\" (UID: \"69b2231e-4f54-4554-8e7a-d46e644d6b81\") " pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000916 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4rzx8\" (UniqueName: \"kubernetes.io/projected/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-kube-api-access-4rzx8\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000946 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.000978 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001028 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlzqt\" (UniqueName: \"kubernetes.io/projected/78904868-f0f9-4198-ac3a-130af7060c38-kube-api-access-jlzqt\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001051 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtt5\" (UniqueName: \"kubernetes.io/projected/d5441c46-ed2e-45d3-8cab-6493dd503085-kube-api-access-fvtt5\") pod \"multus-admission-controller-857f4d67dd-wpz6v\" (UID: \"d5441c46-ed2e-45d3-8cab-6493dd503085\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001279 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-certs\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001319 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001358 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001371 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001385 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-csi-data-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001437 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-serving-cert\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001497 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1bc315a4-bf12-48d0-aa24-da64d82a31f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8xr5z\" (UID: \"1bc315a4-bf12-48d0-aa24-da64d82a31f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001526 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2b447ff0-4b72-429b-a255-bbd745131936-profile-collector-cert\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001552 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e7eb9d0-b927-442e-be78-72787f67986c-machine-approver-tls\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 
04:48:33.001581 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d74c2b-550a-43a4-858a-be942ffece17-service-ca-bundle\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001614 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tscb9\" (UniqueName: \"kubernetes.io/projected/35618a3f-3250-4767-892c-06d7cf99e0a9-kube-api-access-tscb9\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001642 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fqm\" (UniqueName: \"kubernetes.io/projected/f9d839f0-e881-471e-aaf6-a948bb298b17-kube-api-access-n9fqm\") pod \"dns-operator-744455d44c-j6jgh\" (UID: \"f9d839f0-e881-471e-aaf6-a948bb298b17\") " pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001672 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-config\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001701 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2b447ff0-4b72-429b-a255-bbd745131936-kube-api-access-7lq2r\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001734 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001763 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78904868-f0f9-4198-ac3a-130af7060c38-apiservice-cert\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001792 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzpjl\" (UniqueName: \"kubernetes.io/projected/e31ed34c-4127-4040-91fb-c53b671f9ab5-kube-api-access-xzpjl\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001821 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e7eb9d0-b927-442e-be78-72787f67986c-auth-proxy-config\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001870 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tscnx\" (UniqueName: 
\"kubernetes.io/projected/34e83d3a-faaf-4720-85d2-1430c65810fd-kube-api-access-tscnx\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.001971 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcw9f\" (UniqueName: \"kubernetes.io/projected/da00c73e-dcd3-4fb7-aedd-77c84ea82855-kube-api-access-xcw9f\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002024 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002084 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00c73e-dcd3-4fb7-aedd-77c84ea82855-config\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002139 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxrtq\" (UniqueName: \"kubernetes.io/projected/87ef4dec-e273-41a2-96de-6c9cc05122d2-kube-api-access-nxrtq\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002202 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002285 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-etcd-client\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002360 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-serving-cert\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002418 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjgf\" (UniqueName: \"kubernetes.io/projected/1a671e58-ffed-46d3-ae24-460febf09dea-kube-api-access-mxjgf\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002488 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvn9g\" (UniqueName: \"kubernetes.io/projected/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-kube-api-access-wvn9g\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002537 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da00c73e-dcd3-4fb7-aedd-77c84ea82855-images\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002569 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-trusted-ca\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002588 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-proxy-tls\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002609 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002630 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-image-import-ca\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002658 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-encryption-config\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002698 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-plugins-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002719 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-signing-key\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002752 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7pbs\" (UniqueName: \"kubernetes.io/projected/4eab433d-51c6-4d3e-8c47-329eb8b06c52-kube-api-access-z7pbs\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002778 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002798 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-service-ca\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002823 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b429dea-1750-4927-a2bb-9ca8f00c4083-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002850 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002901 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f312b88c-5c97-446d-9d7b-e717ac2124fb-cert\") pod \"ingress-canary-2qqsp\" (UID: \"f312b88c-5c97-446d-9d7b-e717ac2124fb\") " pod="openshift-ingress-canary/ingress-canary-2qqsp" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002903 
4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9d839f0-e881-471e-aaf6-a948bb298b17-metrics-tls\") pod \"dns-operator-744455d44c-j6jgh\" (UID: \"f9d839f0-e881-471e-aaf6-a948bb298b17\") " pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002925 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plvqb\" (UniqueName: \"kubernetes.io/projected/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-kube-api-access-plvqb\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002944 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-service-ca\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002962 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-node-pullsecrets\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.002980 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: 
I1004 04:48:33.002998 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d5441c46-ed2e-45d3-8cab-6493dd503085-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wpz6v\" (UID: \"d5441c46-ed2e-45d3-8cab-6493dd503085\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003026 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4c5\" (UniqueName: \"kubernetes.io/projected/c7efed8f-30b4-470a-9ee5-94f38ed51f37-kube-api-access-pt4c5\") pod \"package-server-manager-789f6589d5-xmxmq\" (UID: \"c7efed8f-30b4-470a-9ee5-94f38ed51f37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003045 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003064 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-serving-cert\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003081 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b429dea-1750-4927-a2bb-9ca8f00c4083-config\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: 
\"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003109 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/78904868-f0f9-4198-ac3a-130af7060c38-tmpfs\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003130 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a671e58-ffed-46d3-ae24-460febf09dea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003149 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2xk\" (UniqueName: \"kubernetes.io/projected/29aee87b-0598-4b50-9b1a-beacaf6d7275-kube-api-access-vf2xk\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003156 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-oauth-config\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003172 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4tlw\" (UniqueName: 
\"kubernetes.io/projected/d9424aaa-698a-43e0-ae1c-614cc4c538a6-kube-api-access-h4tlw\") pod \"control-plane-machine-set-operator-78cbb6b69f-x7jjx\" (UID: \"d9424aaa-698a-43e0-ae1c-614cc4c538a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003209 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78904868-f0f9-4198-ac3a-130af7060c38-webhook-cert\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003255 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-metrics-certs\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003305 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31ed34c-4127-4040-91fb-c53b671f9ab5-config-volume\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003338 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 
04:48:33.003372 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003408 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-policies\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003439 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-client\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003471 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtqxf\" (UniqueName: \"kubernetes.io/projected/1601fa84-c51f-451f-8538-6ee23ed108c1-kube-api-access-vtqxf\") pod \"migrator-59844c95c7-25vps\" (UID: \"1601fa84-c51f-451f-8538-6ee23ed108c1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003500 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtxfv\" (UniqueName: \"kubernetes.io/projected/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-kube-api-access-jtxfv\") pod \"apiserver-76f77b778f-hwfs9\" (UID: 
\"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003530 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-metrics-tls\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003558 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-node-bootstrap-token\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003586 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-images\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003613 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-config\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003638 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2b447ff0-4b72-429b-a255-bbd745131936-srv-cert\") pod 
\"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003673 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-audit-policies\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003701 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-etcd-client\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003732 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4eab433d-51c6-4d3e-8c47-329eb8b06c52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003757 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-oauth-serving-cert\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003794 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003818 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh5zc\" (UniqueName: \"kubernetes.io/projected/1bc315a4-bf12-48d0-aa24-da64d82a31f3-kube-api-access-fh5zc\") pod \"cluster-samples-operator-665b6dd947-8xr5z\" (UID: \"1bc315a4-bf12-48d0-aa24-da64d82a31f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003841 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llcw8\" (UniqueName: \"kubernetes.io/projected/f312b88c-5c97-446d-9d7b-e717ac2124fb-kube-api-access-llcw8\") pod \"ingress-canary-2qqsp\" (UID: \"f312b88c-5c97-446d-9d7b-e717ac2124fb\") " pod="openshift-ingress-canary/ingress-canary-2qqsp" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003865 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-config\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003890 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7eb9d0-b927-442e-be78-72787f67986c-config\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003916 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4l5f\" (UniqueName: \"kubernetes.io/projected/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-kube-api-access-v4l5f\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003944 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-etcd-serving-ca\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.003972 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da00c73e-dcd3-4fb7-aedd-77c84ea82855-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004000 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s59qx\" (UniqueName: \"kubernetes.io/projected/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-kube-api-access-s59qx\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004025 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-serving-cert\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " 
pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004079 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-srv-cert\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004106 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhb7f\" (UniqueName: \"kubernetes.io/projected/45d7e969-0ef5-4ba5-8259-09dbe9eec354-kube-api-access-hhb7f\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004129 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqqj6\" (UniqueName: \"kubernetes.io/projected/b6e9d146-2d36-4313-9f02-2db06b5b5573-kube-api-access-pqqj6\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004148 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004181 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-audit-dir\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " 
pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004164 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b6e9d146-2d36-4313-9f02-2db06b5b5573-metrics-tls\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004265 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da00c73e-dcd3-4fb7-aedd-77c84ea82855-images\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004299 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004335 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45d7e969-0ef5-4ba5-8259-09dbe9eec354-audit-dir\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004374 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-config\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 
04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004415 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68zpb\" (UniqueName: \"kubernetes.io/projected/5eadb650-b9f5-4f66-a038-0a381546b35d-kube-api-access-68zpb\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004454 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrdp\" (UniqueName: \"kubernetes.io/projected/00d74c2b-550a-43a4-858a-be942ffece17-kube-api-access-kdrdp\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004483 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-audit\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004516 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004548 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9424aaa-698a-43e0-ae1c-614cc4c538a6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x7jjx\" 
(UID: \"d9424aaa-698a-43e0-ae1c-614cc4c538a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004583 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-socket-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004816 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-socket-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.005495 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.006193 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.006303 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a671e58-ffed-46d3-ae24-460febf09dea-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.006805 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-trusted-ca\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.007206 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.007322 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-trusted-ca-bundle\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.007531 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-plugins-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.007676 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-service-ca\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.007755 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-node-pullsecrets\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.008266 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00d74c2b-550a-43a4-858a-be942ffece17-service-ca-bundle\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.009686 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2e7eb9d0-b927-442e-be78-72787f67986c-machine-approver-tls\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.009993 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.010111 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.010351 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.010477 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e7eb9d0-b927-442e-be78-72787f67986c-auth-proxy-config\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.010909 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-policies\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.011109 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc 
kubenswrapper[4574]: I1004 04:48:33.011373 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.011741 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.511719932 +0000 UTC m=+139.365863164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.011953 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da00c73e-dcd3-4fb7-aedd-77c84ea82855-config\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.011974 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-encryption-config\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.013427 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eadb650-b9f5-4f66-a038-0a381546b35d-serving-cert\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.013491 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.013582 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-csi-data-dir\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.013838 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-client\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.014060 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.014122 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45d7e969-0ef5-4ba5-8259-09dbe9eec354-audit-dir\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.014656 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7eb9d0-b927-442e-be78-72787f67986c-config\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.014982 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-etcd-service-ca\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.015549 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.016301 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-config\") pod \"console-f9d7485db-l8x2m\" (UID: 
\"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.016381 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45d7e969-0ef5-4ba5-8259-09dbe9eec354-audit-policies\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.016436 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.017055 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-serving-cert\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.017352 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eadb650-b9f5-4f66-a038-0a381546b35d-config\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.017648 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/78904868-f0f9-4198-ac3a-130af7060c38-tmpfs\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: 
\"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.017693 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a671e58-ffed-46d3-ae24-460febf09dea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.017376 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1bc315a4-bf12-48d0-aa24-da64d82a31f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8xr5z\" (UID: \"1bc315a4-bf12-48d0-aa24-da64d82a31f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.017756 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.004080 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-stats-auth\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.018319 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/da00c73e-dcd3-4fb7-aedd-77c84ea82855-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.018952 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4eab433d-51c6-4d3e-8c47-329eb8b06c52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.019577 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-oauth-serving-cert\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.019690 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-metrics-tls\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.020523 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.021607 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: 
\"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.021636 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-serving-cert\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.022273 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/45d7e969-0ef5-4ba5-8259-09dbe9eec354-etcd-client\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.022531 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-default-certificate\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.040890 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.048537 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00d74c2b-550a-43a4-858a-be942ffece17-metrics-certs\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.060917 4574 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.081183 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.099873 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.103966 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/654c3af8-4315-43f4-aedf-366422a88358-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.106464 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.106766 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.606725567 +0000 UTC m=+139.460868609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.107024 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.107591 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.607571921 +0000 UTC m=+139.461714963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.121058 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.124553 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/654c3af8-4315-43f4-aedf-366422a88358-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.177784 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g699f\" (UniqueName: \"kubernetes.io/projected/bf4a2793-dfb2-475f-9f1a-4e48261cf8a1-kube-api-access-g699f\") pod \"authentication-operator-69f744f599-ngf8v\" (UID: \"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.194794 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6a6950f-165f-4419-8f44-4e65c42c51b4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:33 crc 
kubenswrapper[4574]: I1004 04:48:33.209527 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.209858 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.709792115 +0000 UTC m=+139.563935157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.209994 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.210570 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:33.710549237 +0000 UTC m=+139.564692459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.214016 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.257960 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkfj\" (UniqueName: \"kubernetes.io/projected/d6a6950f-165f-4419-8f44-4e65c42c51b4-kube-api-access-tvkfj\") pod \"cluster-image-registry-operator-dc59b4c8b-44xlf\" (UID: \"d6a6950f-165f-4419-8f44-4e65c42c51b4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.279755 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.283895 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4f8t\" (UniqueName: \"kubernetes.io/projected/676a5a14-3672-401d-8cbe-df1c5b4081be-kube-api-access-j4f8t\") pod \"kube-storage-version-migrator-operator-b67b599dd-chcp5\" (UID: \"676a5a14-3672-401d-8cbe-df1c5b4081be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 
04:48:33.293224 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.312700 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.313494 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.314282 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.814197163 +0000 UTC m=+139.668340215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.316108 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.316657 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.816639984 +0000 UTC m=+139.670783026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.322497 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.332348 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.339750 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.368441 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.381088 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-signing-cabundle\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.383782 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.400541 4574 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.407928 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31ed34c-4127-4040-91fb-c53b671f9ab5-secret-volume\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.411095 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2b447ff0-4b72-429b-a255-bbd745131936-profile-collector-cert\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.417199 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.417509 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.417670 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-10-04 04:48:33.917635892 +0000 UTC m=+139.771779124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.421779 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.425810 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31ed34c-4127-4040-91fb-c53b671f9ab5-config-volume\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.464366 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.464704 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.477435 4574 secret.go:188] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.477512 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert 
podName:699add67-bf01-4799-80ff-615e4ea6da01 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.977493453 +0000 UTC m=+139.831636495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert") pod "openshift-config-operator-7777fb866f-dzvnb" (UID: "699add67-bf01-4799-80ff-615e4ea6da01") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.481003 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485130 4574 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485162 4574 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485192 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.985173847 +0000 UTC m=+139.839316889 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485354 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.985325831 +0000 UTC m=+139.839468893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485124 4574 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485400 4574 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485404 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.985396023 +0000 UTC m=+139.839539065 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.485429 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:33.985423214 +0000 UTC m=+139.839566256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.492505 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-signing-key\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.509994 4574 request.go:700] Waited for 1.015557396s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.514557 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.520117 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.520554 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.521289 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.021275717 +0000 UTC m=+139.875418769 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.544557 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.550658 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2b447ff0-4b72-429b-a255-bbd745131936-srv-cert\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:33 crc 
kubenswrapper[4574]: I1004 04:48:33.560804 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.581606 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.588692 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ngf8v"] Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.594572 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7efed8f-30b4-470a-9ee5-94f38ed51f37-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xmxmq\" (UID: \"c7efed8f-30b4-470a-9ee5-94f38ed51f37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.603945 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.621671 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.622647 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:34.122604175 +0000 UTC m=+139.976747217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.623641 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.625380 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-config\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.640810 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.647369 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-serving-cert\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.660063 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.680473 4574 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.691308 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-srv-cert\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.699850 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.719849 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.725201 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.725710 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.225696834 +0000 UTC m=+140.079839876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.731472 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d5441c46-ed2e-45d3-8cab-6493dd503085-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wpz6v\" (UID: \"d5441c46-ed2e-45d3-8cab-6493dd503085\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.739606 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.759841 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.768828 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-images\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.779655 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.787147 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78904868-f0f9-4198-ac3a-130af7060c38-webhook-cert\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.787438 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78904868-f0f9-4198-ac3a-130af7060c38-apiservice-cert\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.800292 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.804219 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4eab433d-51c6-4d3e-8c47-329eb8b06c52-proxy-tls\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.820266 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.826683 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.826803 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.326777865 +0000 UTC m=+140.180920907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.826919 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.827632 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.327605899 +0000 UTC m=+140.181748951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.839366 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf"] Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.840457 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 04 04:48:33 crc kubenswrapper[4574]: W1004 04:48:33.847261 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a6950f_165f_4419_8f44_4e65c42c51b4.slice/crio-4dfb401757d7416330b4cb03adaa453e62b100a5792878bb45335554a69c6ef5 WatchSource:0}: Error finding container 4dfb401757d7416330b4cb03adaa453e62b100a5792878bb45335554a69c6ef5: Status 404 returned error can't find the container with id 4dfb401757d7416330b4cb03adaa453e62b100a5792878bb45335554a69c6ef5 Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.852751 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5"] Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.854154 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-proxy-tls\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.860205 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 04 04:48:33 crc kubenswrapper[4574]: W1004 04:48:33.868853 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676a5a14_3672_401d_8cbe_df1c5b4081be.slice/crio-df587b7cada4acaf97386a850a3ef1abbbf55269728909ecdde0f950d57101af WatchSource:0}: Error finding container df587b7cada4acaf97386a850a3ef1abbbf55269728909ecdde0f950d57101af: Status 404 returned error can't find the container with id df587b7cada4acaf97386a850a3ef1abbbf55269728909ecdde0f950d57101af Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.882686 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.886626 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-serving-cert\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.891504 4574 secret.go:188] Couldn't get secret openshift-console-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.891657 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-serving-cert podName:0e96869e-a5cb-4b5e-b99f-04f3097b8d4c nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.391633552 +0000 UTC m=+140.245776594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-serving-cert") pod "console-operator-58897d9998-ms6sm" (UID: "0e96869e-a5cb-4b5e-b99f-04f3097b8d4c") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.900805 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.919907 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.930595 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.930780 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.43075709 +0000 UTC m=+140.284900132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.932501 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:33 crc kubenswrapper[4574]: E1004 04:48:33.933709 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.433691026 +0000 UTC m=+140.287834068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.938869 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-encryption-config\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.939558 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.949407 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-config\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.960142 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.969087 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-etcd-client\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.980730 4574 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 04 04:48:33 crc kubenswrapper[4574]: I1004 04:48:33.986080 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-etcd-serving-ca\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.001133 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.001214 4574 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.001327 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6e9d146-2d36-4313-9f02-2db06b5b5573-config-volume podName:b6e9d146-2d36-4313-9f02-2db06b5b5573 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.501302283 +0000 UTC m=+140.355445455 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/b6e9d146-2d36-4313-9f02-2db06b5b5573-config-volume") pod "dns-default-zxrvh" (UID: "b6e9d146-2d36-4313-9f02-2db06b5b5573") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.005062 4574 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.005136 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6e9d146-2d36-4313-9f02-2db06b5b5573-metrics-tls podName:b6e9d146-2d36-4313-9f02-2db06b5b5573 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.505116524 +0000 UTC m=+140.359259556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b6e9d146-2d36-4313-9f02-2db06b5b5573-metrics-tls") pod "dns-default-zxrvh" (UID: "b6e9d146-2d36-4313-9f02-2db06b5b5573") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.005827 4574 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.006002 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f312b88c-5c97-446d-9d7b-e717ac2124fb-cert podName:f312b88c-5c97-446d-9d7b-e717ac2124fb nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.505981439 +0000 UTC m=+140.360124481 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f312b88c-5c97-446d-9d7b-e717ac2124fb-cert") pod "ingress-canary-2qqsp" (UID: "f312b88c-5c97-446d-9d7b-e717ac2124fb") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.007766 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-audit\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.007885 4574 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.007933 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-image-import-ca podName:b80b22b2-92cb-4d46-aaa1-1b20a9b38445 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.507918425 +0000 UTC m=+140.362061467 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-image-import-ca") pod "apiserver-76f77b778f-hwfs9" (UID: "b80b22b2-92cb-4d46-aaa1-1b20a9b38445") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.008092 4574 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.008285 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-trusted-ca-bundle podName:b80b22b2-92cb-4d46-aaa1-1b20a9b38445 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.508261955 +0000 UTC m=+140.362405277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-trusted-ca-bundle") pod "apiserver-76f77b778f-hwfs9" (UID: "b80b22b2-92cb-4d46-aaa1-1b20a9b38445") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.012475 4574 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.012559 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-certs podName:35618a3f-3250-4767-892c-06d7cf99e0a9 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.51254122 +0000 UTC m=+140.366684262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-certs") pod "machine-config-server-g2zvs" (UID: "35618a3f-3250-4767-892c-06d7cf99e0a9") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.013347 4574 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.013399 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9b429dea-1750-4927-a2bb-9ca8f00c4083-config podName:9b429dea-1750-4927-a2bb-9ca8f00c4083 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.513388694 +0000 UTC m=+140.367531906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/9b429dea-1750-4927-a2bb-9ca8f00c4083-config") pod "openshift-apiserver-operator-796bbdcf4f-24wzl" (UID: "9b429dea-1750-4927-a2bb-9ca8f00c4083") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.015818 4574 secret.go:188] Couldn't get secret openshift-apiserver-operator/openshift-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.015953 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b429dea-1750-4927-a2bb-9ca8f00c4083-serving-cert podName:9b429dea-1750-4927-a2bb-9ca8f00c4083 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.515938889 +0000 UTC m=+140.370082131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9b429dea-1750-4927-a2bb-9ca8f00c4083-serving-cert") pod "openshift-apiserver-operator-796bbdcf4f-24wzl" (UID: "9b429dea-1750-4927-a2bb-9ca8f00c4083") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.016617 4574 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.016760 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-node-bootstrap-token podName:35618a3f-3250-4767-892c-06d7cf99e0a9 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.516747252 +0000 UTC m=+140.370890294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-node-bootstrap-token") pod "machine-config-server-g2zvs" (UID: "35618a3f-3250-4767-892c-06d7cf99e0a9") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.016820 4574 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.019002 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9424aaa-698a-43e0-ae1c-614cc4c538a6-control-plane-machine-set-operator-tls podName:d9424aaa-698a-43e0-ae1c-614cc4c538a6 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.518695499 +0000 UTC m=+140.372838731 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/d9424aaa-698a-43e0-ae1c-614cc4c538a6-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-x7jjx" (UID: "d9424aaa-698a-43e0-ae1c-614cc4c538a6") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.020381 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.035117 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.035361 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.535334793 +0000 UTC m=+140.389477835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.035905 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.036063 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.036214 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.036268 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca\") pod 
\"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.036329 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.036406 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.536396304 +0000 UTC m=+140.390539346 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.037176 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.047604 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.059553 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.084005 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.101199 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.121322 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.138607 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.139538 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.639523134 +0000 UTC m=+140.493666176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.140338 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.160381 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.181545 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.200099 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.215710 4574 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.220164 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.234124 4574 projected.go:288] Couldn't get configMap openshift-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.240402 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.241516 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.242406 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.742393317 +0000 UTC m=+140.596536359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.259614 4574 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.280643 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.301488 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.307392 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" event={"ID":"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1","Type":"ContainerStarted","Data":"a109abbeecb9bcee9b77af8c1fad60d2acd95943618f7c70160571fd81f8ef00"}
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.307468 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" event={"ID":"bf4a2793-dfb2-475f-9f1a-4e48261cf8a1","Type":"ContainerStarted","Data":"f07580713b10574bbf381f987d0b2b1331b1beae22aad2bea52e400b58b9205c"}
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.309637 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" event={"ID":"676a5a14-3672-401d-8cbe-df1c5b4081be","Type":"ContainerStarted","Data":"3c99f87d06c6a89b10460afa9ff3262594d33ac9d42a9bb3e54fd190c6444c8c"}
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.309785 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" event={"ID":"676a5a14-3672-401d-8cbe-df1c5b4081be","Type":"ContainerStarted","Data":"df587b7cada4acaf97386a850a3ef1abbbf55269728909ecdde0f950d57101af"}
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.312176 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" event={"ID":"d6a6950f-165f-4419-8f44-4e65c42c51b4","Type":"ContainerStarted","Data":"452531b0a9132814791622b6e46b5aa1b82cf92692cac90042a27b4ae6fd357f"}
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.312484 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" event={"ID":"d6a6950f-165f-4419-8f44-4e65c42c51b4","Type":"ContainerStarted","Data":"4dfb401757d7416330b4cb03adaa453e62b100a5792878bb45335554a69c6ef5"}
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.320724 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.340362 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.343168 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.343448 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.843399486 +0000 UTC m=+140.697542538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.344214 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.345173 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.845155927 +0000 UTC m=+140.699298979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.361002 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.379362 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.400451 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.419921 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.439403 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.445521 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.445761 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.945731083 +0000 UTC m=+140.799874115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.446405 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.446587 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-serving-cert\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.448862 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:34.948853174 +0000 UTC m=+140.802996206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.460403 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.481363 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.518629 4574 request.go:700] Waited for 1.630740553s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/serviceaccounts/kube-apiserver-operator/token
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.537082 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fd256b9-ec48-40a6-9b1e-5ad98b721c71-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6qs2p\" (UID: \"6fd256b9-ec48-40a6-9b1e-5ad98b721c71\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.548801 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549100 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6e9d146-2d36-4313-9f02-2db06b5b5573-config-volume\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549217 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-certs\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549402 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-image-import-ca\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549441 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b429dea-1750-4927-a2bb-9ca8f00c4083-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549463 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f312b88c-5c97-446d-9d7b-e717ac2124fb-cert\") pod \"ingress-canary-2qqsp\" (UID: \"f312b88c-5c97-446d-9d7b-e717ac2124fb\") " pod="openshift-ingress-canary/ingress-canary-2qqsp"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549498 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549533 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b429dea-1750-4927-a2bb-9ca8f00c4083-config\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549588 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-node-bootstrap-token\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549685 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b6e9d146-2d36-4313-9f02-2db06b5b5573-metrics-tls\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.549732 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9424aaa-698a-43e0-ae1c-614cc4c538a6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x7jjx\" (UID: \"d9424aaa-698a-43e0-ae1c-614cc4c538a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.550877 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-image-import-ca\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.551537 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.051504209 +0000 UTC m=+140.905647251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.551804 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.552011 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b429dea-1750-4927-a2bb-9ca8f00c4083-config\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.552126 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6e9d146-2d36-4313-9f02-2db06b5b5573-config-volume\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.554942 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-node-bootstrap-token\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.555466 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9424aaa-698a-43e0-ae1c-614cc4c538a6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x7jjx\" (UID: \"d9424aaa-698a-43e0-ae1c-614cc4c538a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.555657 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/35618a3f-3250-4767-892c-06d7cf99e0a9-certs\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.555890 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f312b88c-5c97-446d-9d7b-e717ac2124fb-cert\") pod \"ingress-canary-2qqsp\" (UID: \"f312b88c-5c97-446d-9d7b-e717ac2124fb\") " pod="openshift-ingress-canary/ingress-canary-2qqsp"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.557562 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b6e9d146-2d36-4313-9f02-2db06b5b5573-metrics-tls\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.559275 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/050c17bb-6aa3-49bd-a875-c2088ffd1799-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7hs9d\" (UID: \"050c17bb-6aa3-49bd-a875-c2088ffd1799\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.567800 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b429dea-1750-4927-a2bb-9ca8f00c4083-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.586323 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.594684 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8tgg\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-kube-api-access-r8tgg\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.616339 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-bound-sa-token\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.636450 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf5tj\" (UniqueName: \"kubernetes.io/projected/2e7eb9d0-b927-442e-be78-72787f67986c-kube-api-access-lf5tj\") pod \"machine-approver-56656f9798-rb475\" (UID: \"2e7eb9d0-b927-442e-be78-72787f67986c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.665039 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbfst\" (UniqueName: \"kubernetes.io/projected/7b39b6a0-b01d-4c0f-aebc-948e613cfe4f-kube-api-access-wbfst\") pod \"service-ca-operator-777779d784-7sktt\" (UID: \"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.668785 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.669962 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.169945015 +0000 UTC m=+141.024088057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.672711 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.682872 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/654c3af8-4315-43f4-aedf-366422a88358-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fj4st\" (UID: \"654c3af8-4315-43f4-aedf-366422a88358\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.697570 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4tlw\" (UniqueName: \"kubernetes.io/projected/d9424aaa-698a-43e0-ae1c-614cc4c538a6-kube-api-access-h4tlw\") pod \"control-plane-machine-set-operator-78cbb6b69f-x7jjx\" (UID: \"d9424aaa-698a-43e0-ae1c-614cc4c538a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.717503 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjgf\" (UniqueName: \"kubernetes.io/projected/1a671e58-ffed-46d3-ae24-460febf09dea-kube-api-access-mxjgf\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkh4p\" (UID: \"1a671e58-ffed-46d3-ae24-460febf09dea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.737159 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvn9g\" (UniqueName: \"kubernetes.io/projected/a1a868b0-c592-465e-b6a0-cb0a3c73dbd8-kube-api-access-wvn9g\") pod \"olm-operator-6b444d44fb-plrnd\" (UID: \"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.759750 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rzx8\" (UniqueName: \"kubernetes.io/projected/a90a7c9e-a3f1-4992-85ea-c8b539f1123f-kube-api-access-4rzx8\") pod \"csi-hostpathplugin-kv6r9\" (UID: \"a90a7c9e-a3f1-4992-85ea-c8b539f1123f\") " pod="hostpath-provisioner/csi-hostpathplugin-kv6r9"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.765375 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d"]
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.774658 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlbms\" (UniqueName: \"kubernetes.io/projected/9b429dea-1750-4927-a2bb-9ca8f00c4083-kube-api-access-vlbms\") pod \"openshift-apiserver-operator-796bbdcf4f-24wzl\" (UID: \"9b429dea-1750-4927-a2bb-9ca8f00c4083\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.774691 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.774907 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.274872378 +0000 UTC m=+141.129015420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.775033 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp"
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.775476 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.275463295 +0000 UTC m=+141.129606537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.788478 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.795867 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plvqb\" (UniqueName: \"kubernetes.io/projected/c94e6a0d-58e8-40b5-b818-b38b9d79ced1-kube-api-access-plvqb\") pod \"machine-config-operator-74547568cd-hr8xq\" (UID: \"c94e6a0d-58e8-40b5-b818-b38b9d79ced1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.816947 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.818461 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7pbs\" (UniqueName: \"kubernetes.io/projected/4eab433d-51c6-4d3e-8c47-329eb8b06c52-kube-api-access-z7pbs\") pod \"machine-config-controller-84d6567774-nrsmf\" (UID: \"4eab433d-51c6-4d3e-8c47-329eb8b06c52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.836849 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzpjl\" (UniqueName: \"kubernetes.io/projected/e31ed34c-4127-4040-91fb-c53b671f9ab5-kube-api-access-xzpjl\") pod \"collect-profiles-29325885-zs6xq\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.845501 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.852454 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.859215 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4c5\" (UniqueName: \"kubernetes.io/projected/c7efed8f-30b4-470a-9ee5-94f38ed51f37-kube-api-access-pt4c5\") pod \"package-server-manager-789f6589d5-xmxmq\" (UID: \"c7efed8f-30b4-470a-9ee5-94f38ed51f37\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.871875 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.877482 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tscb9\" (UniqueName: \"kubernetes.io/projected/35618a3f-3250-4767-892c-06d7cf99e0a9-kube-api-access-tscb9\") pod \"machine-config-server-g2zvs\" (UID: \"35618a3f-3250-4767-892c-06d7cf99e0a9\") " pod="openshift-machine-config-operator/machine-config-server-g2zvs"
Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.878698 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.878899 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.378852963 +0000 UTC m=+141.232996005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.879577 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.881302 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.381276263 +0000 UTC m=+141.235419305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.899936 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.915957 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.920813 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.927206 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fqm\" (UniqueName: \"kubernetes.io/projected/f9d839f0-e881-471e-aaf6-a948bb298b17-kube-api-access-n9fqm\") pod \"dns-operator-744455d44c-j6jgh\" (UID: \"f9d839f0-e881-471e-aaf6-a948bb298b17\") " pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.932656 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2b447ff0-4b72-429b-a255-bbd745131936-kube-api-access-7lq2r\") pod \"catalog-operator-68c6474976-njnv9\" (UID: \"2b447ff0-4b72-429b-a255-bbd745131936\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.941079 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlzqt\" (UniqueName: \"kubernetes.io/projected/78904868-f0f9-4198-ac3a-130af7060c38-kube-api-access-jlzqt\") pod \"packageserver-d55dfcdfc-qvwr6\" (UID: \"78904868-f0f9-4198-ac3a-130af7060c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.944136 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.968081 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtt5\" (UniqueName: \"kubernetes.io/projected/d5441c46-ed2e-45d3-8cab-6493dd503085-kube-api-access-fvtt5\") pod \"multus-admission-controller-857f4d67dd-wpz6v\" (UID: \"d5441c46-ed2e-45d3-8cab-6493dd503085\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.972767 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g2zvs" Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.982132 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:34 crc kubenswrapper[4574]: E1004 04:48:34.982855 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.482833708 +0000 UTC m=+141.336976750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:34 crc kubenswrapper[4574]: I1004 04:48:34.987899 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tscnx\" (UniqueName: \"kubernetes.io/projected/34e83d3a-faaf-4720-85d2-1430c65810fd-kube-api-access-tscnx\") pod \"oauth-openshift-558db77b4-pcvkf\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.002474 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcw9f\" (UniqueName: \"kubernetes.io/projected/da00c73e-dcd3-4fb7-aedd-77c84ea82855-kube-api-access-xcw9f\") pod \"machine-api-operator-5694c8668f-hkp92\" (UID: \"da00c73e-dcd3-4fb7-aedd-77c84ea82855\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.006843 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.018139 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.021649 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxrtq\" (UniqueName: \"kubernetes.io/projected/87ef4dec-e273-41a2-96de-6c9cc05122d2-kube-api-access-nxrtq\") pod \"console-f9d7485db-l8x2m\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.040137 4574 secret.go:188] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.040319 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert podName:699add67-bf01-4799-80ff-615e4ea6da01 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.040274689 +0000 UTC m=+141.894417731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert") pod "openshift-config-operator-7777fb866f-dzvnb" (UID: "699add67-bf01-4799-80ff-615e4ea6da01") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041162 4574 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041206 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.041194166 +0000 UTC m=+141.895337208 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041331 4574 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041374 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.041362471 +0000 UTC m=+141.895505513 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync secret cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041414 4574 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041439 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.041433313 +0000 UTC m=+141.895576355 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041471 4574 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.041490 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.041484404 +0000 UTC m=+141.895627446 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.043138 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq6kw\" (UniqueName: \"kubernetes.io/projected/69b2231e-4f54-4554-8e7a-d46e644d6b81-kube-api-access-fq6kw\") pod \"downloads-7954f5f757-2nmbr\" (UID: \"69b2231e-4f54-4554-8e7a-d46e644d6b81\") " pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.061474 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.080288 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.083891 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.084390 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.584374992 +0000 UTC m=+141.438518034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.101885 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.113169 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llcw8\" (UniqueName: \"kubernetes.io/projected/f312b88c-5c97-446d-9d7b-e717ac2124fb-kube-api-access-llcw8\") pod \"ingress-canary-2qqsp\" (UID: \"f312b88c-5c97-446d-9d7b-e717ac2124fb\") " pod="openshift-ingress-canary/ingress-canary-2qqsp" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.113716 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtqxf\" (UniqueName: \"kubernetes.io/projected/1601fa84-c51f-451f-8538-6ee23ed108c1-kube-api-access-vtqxf\") pod \"migrator-59844c95c7-25vps\" (UID: \"1601fa84-c51f-451f-8538-6ee23ed108c1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.114033 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh5zc\" (UniqueName: \"kubernetes.io/projected/1bc315a4-bf12-48d0-aa24-da64d82a31f3-kube-api-access-fh5zc\") pod \"cluster-samples-operator-665b6dd947-8xr5z\" (UID: \"1bc315a4-bf12-48d0-aa24-da64d82a31f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.119967 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtxfv\" (UniqueName: \"kubernetes.io/projected/b80b22b2-92cb-4d46-aaa1-1b20a9b38445-kube-api-access-jtxfv\") pod \"apiserver-76f77b778f-hwfs9\" (UID: \"b80b22b2-92cb-4d46-aaa1-1b20a9b38445\") " pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.120590 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.130556 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.160574 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4l5f\" (UniqueName: \"kubernetes.io/projected/3ee82682-0c4d-4c04-ad10-5ce85fa21f1f-kube-api-access-v4l5f\") pod \"service-ca-9c57cc56f-tn7qm\" (UID: \"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.162461 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.178861 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.183359 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.185263 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68zpb\" (UniqueName: \"kubernetes.io/projected/5eadb650-b9f5-4f66-a038-0a381546b35d-kube-api-access-68zpb\") pod \"etcd-operator-b45778765-wwzd7\" (UID: \"5eadb650-b9f5-4f66-a038-0a381546b35d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.190970 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.207669 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.707615558 +0000 UTC m=+141.561758600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.207886 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.208442 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.708434641 +0000 UTC m=+141.562577683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.208622 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.215145 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrdp\" (UniqueName: \"kubernetes.io/projected/00d74c2b-550a-43a4-858a-be942ffece17-kube-api-access-kdrdp\") pod \"router-default-5444994796-ms27n\" (UID: \"00d74c2b-550a-43a4-858a-be942ffece17\") " pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.216392 4574 projected.go:288] Couldn't get configMap openshift-controller-manager/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.216439 4574 projected.go:194] Error preparing data for projected volume kube-api-access-hlz45 for pod openshift-controller-manager/controller-manager-879f6c89f-k52jj: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.216561 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45 podName:f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.716518557 +0000 UTC m=+141.570661599 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hlz45" (UniqueName: "kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45") pod "controller-manager-879f6c89f-k52jj" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.225102 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhb7f\" (UniqueName: \"kubernetes.io/projected/45d7e969-0ef5-4ba5-8259-09dbe9eec354-kube-api-access-hhb7f\") pod \"apiserver-7bbb656c7d-q6brr\" (UID: \"45d7e969-0ef5-4ba5-8259-09dbe9eec354\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.229974 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.234751 4574 projected.go:288] Couldn't get configMap openshift-config-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.234779 4574 projected.go:194] Error preparing data for projected volume kube-api-access-m7pqf for pod openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb: failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.234834 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/699add67-bf01-4799-80ff-615e4ea6da01-kube-api-access-m7pqf podName:699add67-bf01-4799-80ff-615e4ea6da01 nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.734814709 +0000 UTC m=+141.588957751 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m7pqf" (UniqueName: "kubernetes.io/projected/699add67-bf01-4799-80ff-615e4ea6da01-kube-api-access-m7pqf") pod "openshift-config-operator-7777fb866f-dzvnb" (UID: "699add67-bf01-4799-80ff-615e4ea6da01") : failed to sync configmap cache: timed out waiting for the condition Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.240894 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s59qx\" (UniqueName: \"kubernetes.io/projected/e45f01df-92bd-4fe0-b70e-cce7a0215e8a-kube-api-access-s59qx\") pod \"ingress-operator-5b745b69d9-xjdmx\" (UID: \"e45f01df-92bd-4fe0-b70e-cce7a0215e8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.255947 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.256255 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2qqsp" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.269802 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2xk\" (UniqueName: \"kubernetes.io/projected/29aee87b-0598-4b50-9b1a-beacaf6d7275-kube-api-access-vf2xk\") pod \"marketplace-operator-79b997595-44hzk\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.309773 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.310265 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.810248174 +0000 UTC m=+141.664391216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.310401 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.316550 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.323920 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqqj6\" (UniqueName: \"kubernetes.io/projected/b6e9d146-2d36-4313-9f02-2db06b5b5573-kube-api-access-pqqj6\") pod \"dns-default-zxrvh\" (UID: \"b6e9d146-2d36-4313-9f02-2db06b5b5573\") " pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.324649 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.327540 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.331042 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.347438 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.349449 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" event={"ID":"2e7eb9d0-b927-442e-be78-72787f67986c","Type":"ContainerStarted","Data":"38ba33b1ca973aa75dec5059dafbd5ecc26fe31398fc041911bc8facd946fd61"} Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.349500 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" event={"ID":"2e7eb9d0-b927-442e-be78-72787f67986c","Type":"ContainerStarted","Data":"fd825477a76205867927d44cc8acade7f458448f86bd0d3c56cf1f658c60d6c0"} Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.350276 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.351263 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.360146 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.362417 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g2zvs" event={"ID":"35618a3f-3250-4767-892c-06d7cf99e0a9","Type":"ContainerStarted","Data":"4a7f5299eceb85a87dc6e57ccccf1dd61fe60deb0c18780a06b5943480f58604"} Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.364736 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" event={"ID":"654c3af8-4315-43f4-aedf-366422a88358","Type":"ContainerStarted","Data":"fe9c6a0ba1b764ac6c8750f36a5c54fda4e7d64ca32297b23d2b17b00a8e55aa"} Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.367174 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-479vz\" (UniqueName: \"kubernetes.io/projected/f3111436-b5d8-405e-ab14-2fb33bd107c0-kube-api-access-479vz\") pod \"route-controller-manager-6576b87f9c-n25jn\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.372330 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.378923 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.381249 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.385108 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" event={"ID":"050c17bb-6aa3-49bd-a875-c2088ffd1799","Type":"ContainerStarted","Data":"0fb1e78af7891796f268ad8389766a6231079211e6e31147f96839f70417c3a5"} Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.385142 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" event={"ID":"050c17bb-6aa3-49bd-a875-c2088ffd1799","Type":"ContainerStarted","Data":"281e3a688727e8dfab134c5f583b0d7f7942a69038d88531a27deb7a58a887a7"} Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.387604 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.394815 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.396382 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-serving-cert\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.401439 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.411434 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.411686 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.414165 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:35.914151716 +0000 UTC m=+141.768294758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.420053 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.440682 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.465368 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.483066 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.500692 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.510790 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rv2\" (UniqueName: \"kubernetes.io/projected/0e96869e-a5cb-4b5e-b99f-04f3097b8d4c-kube-api-access-66rv2\") pod \"console-operator-58897d9998-ms6sm\" (UID: \"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c\") " pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.515801 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.015774643 +0000 UTC m=+141.869917685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.515694 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.516073 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.518392 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.018382319 +0000 UTC m=+141.872525361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.538345 4574 request.go:700] Waited for 1.863391071s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.545971 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.548722 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.558786 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.560625 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.562955 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.565658 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.659880 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.661831 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.666317 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.666339 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.16622398 +0000 UTC m=+142.020367022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.678245 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.679052 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.679715 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.179699562 +0000 UTC m=+142.033842604 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.736845 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7sktt"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.749152 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.767165 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.780516 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.780894 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.280869336 +0000 UTC m=+142.135012368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.782108 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlz45\" (UniqueName: \"kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.782142 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pqf\" (UniqueName: \"kubernetes.io/projected/699add67-bf01-4799-80ff-615e4ea6da01-kube-api-access-m7pqf\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.782217 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.783301 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.283285416 +0000 UTC m=+142.137428458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.807946 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pqf\" (UniqueName: \"kubernetes.io/projected/699add67-bf01-4799-80ff-615e4ea6da01-kube-api-access-m7pqf\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.833670 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.854762 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlz45\" (UniqueName: \"kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.885846 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.886710 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.386671284 +0000 UTC m=+142.240814326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.905723 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pcvkf"] Oct 04 04:48:35 crc kubenswrapper[4574]: I1004 04:48:35.991950 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:35 crc kubenswrapper[4574]: E1004 04:48:35.992424 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:36.49241202 +0000 UTC m=+142.346555062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: W1004 04:48:36.069832 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b429dea_1750_4927_a2bb_9ca8f00c4083.slice/crio-4eb667d9fcc34629f0672e9baa8d6e068ff036367c56b467f54110f897347404 WatchSource:0}: Error finding container 4eb667d9fcc34629f0672e9baa8d6e068ff036367c56b467f54110f897347404: Status 404 returned error can't find the container with id 4eb667d9fcc34629f0672e9baa8d6e068ff036367c56b467f54110f897347404 Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.093697 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.094291 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.094330 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.094386 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.094407 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.094427 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.102473 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 
04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.103023 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.103099 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.6030875 +0000 UTC m=+142.457230542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.113788 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: W1004 04:48:36.119586 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc94e6a0d_58e8_40b5_b818_b38b9d79ced1.slice/crio-af2cc823544520014d9e7732f50b0c326d8888ab9374c797edeeb4749c96f1ed WatchSource:0}: Error finding 
container af2cc823544520014d9e7732f50b0c326d8888ab9374c797edeeb4749c96f1ed: Status 404 returned error can't find the container with id af2cc823544520014d9e7732f50b0c326d8888ab9374c797edeeb4749c96f1ed Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.132601 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert\") pod \"controller-manager-879f6c89f-k52jj\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.134477 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699add67-bf01-4799-80ff-615e4ea6da01-serving-cert\") pod \"openshift-config-operator-7777fb866f-dzvnb\" (UID: \"699add67-bf01-4799-80ff-615e4ea6da01\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.196659 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.197331 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.697299931 +0000 UTC m=+142.551442973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.205380 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.224848 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.260652 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:36 crc kubenswrapper[4574]: W1004 04:48:36.262374 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e83d3a_faaf_4720_85d2_1430c65810fd.slice/crio-f59bcf1f1142884018f77030b0a4dcccda99357764470d2955906650f947264f WatchSource:0}: Error finding container f59bcf1f1142884018f77030b0a4dcccda99357764470d2955906650f947264f: Status 404 returned error can't find the container with id f59bcf1f1142884018f77030b0a4dcccda99357764470d2955906650f947264f Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.296268 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.306888 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.307532 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:36.807506007 +0000 UTC m=+142.661649049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.385080 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.413652 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.414138 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:36.914115529 +0000 UTC m=+142.768258581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: W1004 04:48:36.436327 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a671e58_ffed_46d3_ae24_460febf09dea.slice/crio-9b3bcf9dbb8df6275d443752e47680d04ef9ca5f9c26f242d787f84d296e9b6e WatchSource:0}: Error finding container 9b3bcf9dbb8df6275d443752e47680d04ef9ca5f9c26f242d787f84d296e9b6e: Status 404 returned error can't find the container with id 9b3bcf9dbb8df6275d443752e47680d04ef9ca5f9c26f242d787f84d296e9b6e Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.446935 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g2zvs" event={"ID":"35618a3f-3250-4767-892c-06d7cf99e0a9","Type":"ContainerStarted","Data":"4f315ca6e4b51957e9f8bfe3073f630c23e27aec80d1d628b99cdc3f32f54a60"} Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.515550 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.517008 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.016953261 +0000 UTC m=+142.871096303 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.518109 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.518719 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.018672741 +0000 UTC m=+142.872815963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.619820 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.621251 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.121212864 +0000 UTC m=+142.975355906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.642730 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" event={"ID":"d9424aaa-698a-43e0-ae1c-614cc4c538a6","Type":"ContainerStarted","Data":"f41ad2ece8eb781fd0d2caf180d7b062835ca25215cf2caf4a4f8f77029e780f"} Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.678374 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" event={"ID":"34e83d3a-faaf-4720-85d2-1430c65810fd","Type":"ContainerStarted","Data":"f59bcf1f1142884018f77030b0a4dcccda99357764470d2955906650f947264f"} Oct 04 04:48:36 crc kubenswrapper[4574]: W1004 04:48:36.690173 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode31ed34c_4127_4040_91fb_c53b671f9ab5.slice/crio-7217d6ffcca59f1383a58cd7db06b8369b44bee5f51de9fc9c741175ec944437 WatchSource:0}: Error finding container 7217d6ffcca59f1383a58cd7db06b8369b44bee5f51de9fc9c741175ec944437: Status 404 returned error can't find the container with id 7217d6ffcca59f1383a58cd7db06b8369b44bee5f51de9fc9c741175ec944437 Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.723452 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.724036 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.224019075 +0000 UTC m=+143.078162107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.824953 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.838116 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.338050162 +0000 UTC m=+143.192193214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925610 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" event={"ID":"4eab433d-51c6-4d3e-8c47-329eb8b06c52","Type":"ContainerStarted","Data":"78190d87e0cd7d629dc27bc37ac2a17c1cece09de1af18e57287c7126fe1a43e"} Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925649 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" event={"ID":"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8","Type":"ContainerStarted","Data":"1f266cbac413fa8e5030a1ee5e63580c43393a2025f14ad50815eb021bd65625"} Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925667 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j6jgh"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925681 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925692 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" event={"ID":"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f","Type":"ContainerStarted","Data":"c83fcfc9b7c12b0de6ed73159cbce7cb66bf56141c3be3f117eba0fef3b39cfe"} Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925710 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-kv6r9"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925722 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925732 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.925740 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wpz6v"] Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.938580 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:36 crc kubenswrapper[4574]: E1004 04:48:36.939042 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.43902633 +0000 UTC m=+143.293169372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.943660 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ms27n" event={"ID":"00d74c2b-550a-43a4-858a-be942ffece17","Type":"ContainerStarted","Data":"d69ffacc4362763e9922a981a4be1383ef7b8c026cea5f06a16078feb62f683b"} Oct 04 04:48:36 crc kubenswrapper[4574]: W1004 04:48:36.960382 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b447ff0_4b72_429b_a255_bbd745131936.slice/crio-377f6c81ed7143ad995f0c29eb1114dcf1965c8d76a7df87591ef613dd5c91f1 WatchSource:0}: Error finding container 377f6c81ed7143ad995f0c29eb1114dcf1965c8d76a7df87591ef613dd5c91f1: Status 404 returned error can't find the container with id 377f6c81ed7143ad995f0c29eb1114dcf1965c8d76a7df87591ef613dd5c91f1 Oct 04 04:48:36 crc kubenswrapper[4574]: I1004 04:48:36.965607 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-l8x2m"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.012524 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" event={"ID":"9b429dea-1750-4927-a2bb-9ca8f00c4083","Type":"ContainerStarted","Data":"4eb667d9fcc34629f0672e9baa8d6e068ff036367c56b467f54110f897347404"} Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.037659 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" event={"ID":"6fd256b9-ec48-40a6-9b1e-5ad98b721c71","Type":"ContainerStarted","Data":"bb912693945f3b030c2775bd68da7150e61e4451b66809c5401f7e14a48eea53"} Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.039835 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.040572 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.540543984 +0000 UTC m=+143.394687026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.060150 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" event={"ID":"c94e6a0d-58e8-40b5-b818-b38b9d79ced1","Type":"ContainerStarted","Data":"af2cc823544520014d9e7732f50b0c326d8888ab9374c797edeeb4749c96f1ed"} Oct 04 04:48:37 crc kubenswrapper[4574]: W1004 04:48:37.112950 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5441c46_ed2e_45d3_8cab_6493dd503085.slice/crio-af92136d6805e8aef1848c1c06e8820447133597f006c1e24fb0ec3bebffa55f WatchSource:0}: Error finding container af92136d6805e8aef1848c1c06e8820447133597f006c1e24fb0ec3bebffa55f: Status 404 returned error can't find the container with id af92136d6805e8aef1848c1c06e8820447133597f006c1e24fb0ec3bebffa55f Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.127479 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ngf8v" podStartSLOduration=120.127431322 podStartE2EDuration="2m0.127431322s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:37.068110416 +0000 UTC m=+142.922253458" watchObservedRunningTime="2025-10-04 04:48:37.127431322 +0000 UTC m=+142.981574364" Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 
04:48:37.142730 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.144789 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.644765786 +0000 UTC m=+143.498908828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: W1004 04:48:37.151512 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ef4dec_e273_41a2_96de_6c9cc05122d2.slice/crio-e30879a049cbb99ef99f028c91c30a0c2c1eb4ec8df3abbb1744e61eb8623dca WatchSource:0}: Error finding container e30879a049cbb99ef99f028c91c30a0c2c1eb4ec8df3abbb1744e61eb8623dca: Status 404 returned error can't find the container with id e30879a049cbb99ef99f028c91c30a0c2c1eb4ec8df3abbb1744e61eb8623dca Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.179251 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hs9d" 
podStartSLOduration=120.179205328 podStartE2EDuration="2m0.179205328s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:37.178335473 +0000 UTC m=+143.032478515" watchObservedRunningTime="2025-10-04 04:48:37.179205328 +0000 UTC m=+143.033348370" Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.247505 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.254972 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.75490989 +0000 UTC m=+143.609052932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.283784 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hwfs9"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.295854 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2qqsp"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.323852 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-44xlf" podStartSLOduration=120.323820555 podStartE2EDuration="2m0.323820555s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:37.321969461 +0000 UTC m=+143.176112523" watchObservedRunningTime="2025-10-04 04:48:37.323820555 +0000 UTC m=+143.177963597" Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.341516 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.361931 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.362450 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.862433949 +0000 UTC m=+143.716576981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.399809 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-g2zvs" podStartSLOduration=5.399785725 podStartE2EDuration="5.399785725s" podCreationTimestamp="2025-10-04 04:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:37.398883569 +0000 UTC m=+143.253026611" watchObservedRunningTime="2025-10-04 04:48:37.399785725 +0000 UTC m=+143.253928767" Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.404621 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2nmbr"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.462641 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.463097 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:37.963075257 +0000 UTC m=+143.817218299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.467989 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wwzd7"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.565476 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.567828 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.067808344 +0000 UTC m=+143.921951386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.667999 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.674793 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.168696579 +0000 UTC m=+144.022839621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.773559 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.774837 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.274817346 +0000 UTC m=+144.128960388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.802798 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-chcp5" podStartSLOduration=120.796002163 podStartE2EDuration="2m0.796002163s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:37.675354383 +0000 UTC m=+143.529497415" watchObservedRunningTime="2025-10-04 04:48:37.796002163 +0000 UTC m=+143.650145205" Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.804664 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tn7qm"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.812452 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.821323 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.847897 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hkp92"] Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.875062 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.879801 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.37977514 +0000 UTC m=+144.233918182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:37 crc kubenswrapper[4574]: I1004 04:48:37.981873 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:37 crc kubenswrapper[4574]: E1004 04:48:37.982224 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.48221306 +0000 UTC m=+144.336356102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.030833 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ms6sm"] Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.061444 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-44hzk"] Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.092977 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.093060 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.593040033 +0000 UTC m=+144.447183075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.093613 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.094423 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.594406433 +0000 UTC m=+144.448549475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.100291 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zxrvh"] Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.102387 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" event={"ID":"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f","Type":"ContainerStarted","Data":"fbad879e3f49c9c97bd167462de53cd7d219bc3aa066742c63cbd8ac7f7bc294"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.115194 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" event={"ID":"d9424aaa-698a-43e0-ae1c-614cc4c538a6","Type":"ContainerStarted","Data":"e827a4b97661cd4eb23e28c35847d57f1bcb5eb8906e253d908b3c45ab70acf6"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.133872 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn"] Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.141013 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" event={"ID":"b80b22b2-92cb-4d46-aaa1-1b20a9b38445","Type":"ContainerStarted","Data":"5161f9520af73a2b762fe2bfccc003b0f6b961a0616c3d665a9865b7ea581d8f"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.154581 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x7jjx" podStartSLOduration=121.154541043 podStartE2EDuration="2m1.154541043s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:38.14825032 +0000 UTC m=+144.002393362" watchObservedRunningTime="2025-10-04 04:48:38.154541043 +0000 UTC m=+144.008684075" Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.175753 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" event={"ID":"2b447ff0-4b72-429b-a255-bbd745131936","Type":"ContainerStarted","Data":"377f6c81ed7143ad995f0c29eb1114dcf1965c8d76a7df87591ef613dd5c91f1"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.180732 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k52jj"] Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.184321 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" event={"ID":"c7efed8f-30b4-470a-9ee5-94f38ed51f37","Type":"ContainerStarted","Data":"920b768d3692522dd2e2b4cb3234480ee0e7d028c196566b1889045c0c1cfb8d"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.194987 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.196651 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.696621357 +0000 UTC m=+144.550764399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.207746 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" event={"ID":"2e7eb9d0-b927-442e-be78-72787f67986c","Type":"ContainerStarted","Data":"c471d91b08ad7c9da17b77a30659f03aa6a1254b91e7bd97d3051a9c394b40e1"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.220303 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" event={"ID":"4eab433d-51c6-4d3e-8c47-329eb8b06c52","Type":"ContainerStarted","Data":"1947db98168b1d66ec842c62a77482fc985a3641e26772775c359a474bf9eccb"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.224450 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2qqsp" event={"ID":"f312b88c-5c97-446d-9d7b-e717ac2124fb","Type":"ContainerStarted","Data":"3a50b9ea1cc4d8a804cf2a1042410c6eb0c8f554b34d0bd1e57a0d9bbf50741b"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.227255 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" event={"ID":"e45f01df-92bd-4fe0-b70e-cce7a0215e8a","Type":"ContainerStarted","Data":"7a0d786e7b25b71d7760fe8508e56a42b67fea0113936923f618d8aab86b0d30"} Oct 04 04:48:38 
crc kubenswrapper[4574]: I1004 04:48:38.239179 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2nmbr" event={"ID":"69b2231e-4f54-4554-8e7a-d46e644d6b81","Type":"ContainerStarted","Data":"dc7b49aa71f22845330664a650332e5281d1df479c228ce5883687a6ab0ffe65"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.244425 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" event={"ID":"f9d839f0-e881-471e-aaf6-a948bb298b17","Type":"ContainerStarted","Data":"7fbf1e59b2e21120337488c8e2be6ff132b67a1109d1efc4688252de76f31ffe"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.249437 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" event={"ID":"1601fa84-c51f-451f-8538-6ee23ed108c1","Type":"ContainerStarted","Data":"1163c34dce8b2ce69432c52ba3f34f9e20478d699fbf340cdcf5db76fc177ca4"} Oct 04 04:48:38 crc kubenswrapper[4574]: W1004 04:48:38.254924 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29aee87b_0598_4b50_9b1a_beacaf6d7275.slice/crio-1754df0411b3661e9c15ccb36f7961821200df49e68b4cfd1308aba3b8aca11a WatchSource:0}: Error finding container 1754df0411b3661e9c15ccb36f7961821200df49e68b4cfd1308aba3b8aca11a: Status 404 returned error can't find the container with id 1754df0411b3661e9c15ccb36f7961821200df49e68b4cfd1308aba3b8aca11a Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.263127 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" event={"ID":"7b39b6a0-b01d-4c0f-aebc-948e613cfe4f","Type":"ContainerStarted","Data":"4c047048d1f945dca99bcc82856cca7de6387603fc2b833733863812e9d7b0ee"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.276336 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" event={"ID":"45d7e969-0ef5-4ba5-8259-09dbe9eec354","Type":"ContainerStarted","Data":"b318ee78d639a8e250812464546cec786677e43e9e04e46c0bd6480dda516463"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.295366 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" event={"ID":"654c3af8-4315-43f4-aedf-366422a88358","Type":"ContainerStarted","Data":"792cf8f5a0e52623ea5cd11964d9627a6ca24d888d7e24f794373872e4900cd5"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.297702 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.299913 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.799891501 +0000 UTC m=+144.654034543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.306852 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" event={"ID":"a90a7c9e-a3f1-4992-85ea-c8b539f1123f","Type":"ContainerStarted","Data":"f6de439c3e6162d8c2251a4fccdc09255a5fbaa4d4591935ecad019d6c2d901a"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.315073 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7sktt" podStartSLOduration=121.315038982 podStartE2EDuration="2m1.315038982s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:38.305796723 +0000 UTC m=+144.159939765" watchObservedRunningTime="2025-10-04 04:48:38.315038982 +0000 UTC m=+144.169182024" Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.315442 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rb475" podStartSLOduration=121.315434764 podStartE2EDuration="2m1.315434764s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:38.242616095 +0000 UTC m=+144.096759137" watchObservedRunningTime="2025-10-04 04:48:38.315434764 +0000 UTC m=+144.169577806" Oct 04 04:48:38 crc 
kubenswrapper[4574]: I1004 04:48:38.319780 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l8x2m" event={"ID":"87ef4dec-e273-41a2-96de-6c9cc05122d2","Type":"ContainerStarted","Data":"e30879a049cbb99ef99f028c91c30a0c2c1eb4ec8df3abbb1744e61eb8623dca"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.339056 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" event={"ID":"d5441c46-ed2e-45d3-8cab-6493dd503085","Type":"ContainerStarted","Data":"af92136d6805e8aef1848c1c06e8820447133597f006c1e24fb0ec3bebffa55f"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.365594 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fj4st" podStartSLOduration=121.365565712 podStartE2EDuration="2m1.365565712s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:38.352690358 +0000 UTC m=+144.206833410" watchObservedRunningTime="2025-10-04 04:48:38.365565712 +0000 UTC m=+144.219708754" Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.367846 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb"] Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.398120 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.398718 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.898678665 +0000 UTC m=+144.752821887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.398791 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.400688 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:38.900672844 +0000 UTC m=+144.754816106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.409686 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" event={"ID":"e31ed34c-4127-4040-91fb-c53b671f9ab5","Type":"ContainerStarted","Data":"7217d6ffcca59f1383a58cd7db06b8369b44bee5f51de9fc9c741175ec944437"} Oct 04 04:48:38 crc kubenswrapper[4574]: W1004 04:48:38.422843 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699add67_bf01_4799_80ff_615e4ea6da01.slice/crio-303deb7663167570ce97b51ba57a016d5464e58e19ee6c478b798020b944fd6f WatchSource:0}: Error finding container 303deb7663167570ce97b51ba57a016d5464e58e19ee6c478b798020b944fd6f: Status 404 returned error can't find the container with id 303deb7663167570ce97b51ba57a016d5464e58e19ee6c478b798020b944fd6f Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.439195 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" event={"ID":"a1a868b0-c592-465e-b6a0-cb0a3c73dbd8","Type":"ContainerStarted","Data":"dfda3800871b178f17ba5c87bd210504086a432d92f07f7b888ebcf1cc08201b"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.439947 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.443934 4574 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" event={"ID":"78904868-f0f9-4198-ac3a-130af7060c38","Type":"ContainerStarted","Data":"8eb27c73ea3bb1e77d5f463d58213c7b517bcae8b3f14b851db2e33e8d471f7f"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.457318 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" event={"ID":"5eadb650-b9f5-4f66-a038-0a381546b35d","Type":"ContainerStarted","Data":"259a82323fb0106ba1b58bf2caf2f8acd80d4ea8d030ce907d7261afb3eaef35"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.460714 4574 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-plrnd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.460806 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" podUID="a1a868b0-c592-465e-b6a0-cb0a3c73dbd8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.474200 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" podStartSLOduration=121.474166962 podStartE2EDuration="2m1.474166962s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:38.470191956 +0000 UTC m=+144.324334998" watchObservedRunningTime="2025-10-04 04:48:38.474166962 +0000 UTC m=+144.328310004" Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.500117 4574 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.526301 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.026267608 +0000 UTC m=+144.880410650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.548706 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" event={"ID":"1a671e58-ffed-46d3-ae24-460febf09dea","Type":"ContainerStarted","Data":"9b3bcf9dbb8df6275d443752e47680d04ef9ca5f9c26f242d787f84d296e9b6e"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.588590 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" event={"ID":"6fd256b9-ec48-40a6-9b1e-5ad98b721c71","Type":"ContainerStarted","Data":"f253a0654e8f12659abe3d6db48832dadbaa91497a589a8560ab43a699378b8e"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.607654 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.608104 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.108085908 +0000 UTC m=+144.962228950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.623549 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6qs2p" podStartSLOduration=121.623521347 podStartE2EDuration="2m1.623521347s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:38.622545599 +0000 UTC m=+144.476688641" watchObservedRunningTime="2025-10-04 04:48:38.623521347 +0000 UTC m=+144.477664389" Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.664822 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" 
event={"ID":"1bc315a4-bf12-48d0-aa24-da64d82a31f3","Type":"ContainerStarted","Data":"ebf999351d1b340926a175808535c0dfd68d28d4c81e29229047fe86f60be1fc"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.678807 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" event={"ID":"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c","Type":"ContainerStarted","Data":"3665df8d98a1c2a102b8b9866601c082e0ddaa88da768909c17b6e74eddc2de1"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.706824 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" event={"ID":"da00c73e-dcd3-4fb7-aedd-77c84ea82855","Type":"ContainerStarted","Data":"e5ed9b4a94d0c6bdff693fcd068eb035e49d1e38d3345a7be881e00493855f26"} Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.708958 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.709134 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.209098897 +0000 UTC m=+145.063241949 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.709473 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.709946 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.209929501 +0000 UTC m=+145.064072713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.811103 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.812481 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.312456794 +0000 UTC m=+145.166599836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:38 crc kubenswrapper[4574]: I1004 04:48:38.920495 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:38 crc kubenswrapper[4574]: E1004 04:48:38.920987 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.420969951 +0000 UTC m=+145.275113003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.021409 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.021618 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.521593778 +0000 UTC m=+145.375736820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.022664 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.023154 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.523135693 +0000 UTC m=+145.377278735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.123264 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.123594 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.623572985 +0000 UTC m=+145.477716027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.224848 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.225248 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.725214892 +0000 UTC m=+145.579357934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.331708 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.332146 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.832127138 +0000 UTC m=+145.686270190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.433381 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.433853 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:39.933836946 +0000 UTC m=+145.787979988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.535040 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.535395 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.035358429 +0000 UTC m=+145.889501471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.645629 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.646287 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.146263865 +0000 UTC m=+146.000406907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.715553 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" event={"ID":"2b447ff0-4b72-429b-a255-bbd745131936","Type":"ContainerStarted","Data":"46cbbb5e3dbe2235d410515cf3675a75ea9da347a1a055e5fd4772274e22357e"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.715940 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.722575 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ms27n" event={"ID":"00d74c2b-550a-43a4-858a-be942ffece17","Type":"ContainerStarted","Data":"658e61144e0d20d02307b653b9fe55343717c4af3718f54a356365be17f7c61d"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.724125 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" event={"ID":"c7efed8f-30b4-470a-9ee5-94f38ed51f37","Type":"ContainerStarted","Data":"128798153cc09ed142ca8d7874c2b9a7a201d2e7c675adbffd8f3b5f26812780"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.725477 4574 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-njnv9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 
10.217.0.39:8443: connect: connection refused" start-of-body= Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.725559 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" podUID="2b447ff0-4b72-429b-a255-bbd745131936" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.729528 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zxrvh" event={"ID":"b6e9d146-2d36-4313-9f02-2db06b5b5573","Type":"ContainerStarted","Data":"d5d9b73f2a0c99fede624390557084e0d9c04039e99aae9ced8f17ea4d7b032d"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.742559 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" podStartSLOduration=122.742531485 podStartE2EDuration="2m2.742531485s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:39.742403611 +0000 UTC m=+145.596546653" watchObservedRunningTime="2025-10-04 04:48:39.742531485 +0000 UTC m=+145.596674527" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.746510 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.746889 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.2468336 +0000 UTC m=+146.100976642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.747390 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.747771 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.247753267 +0000 UTC m=+146.101896309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.750416 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" event={"ID":"1a671e58-ffed-46d3-ae24-460febf09dea","Type":"ContainerStarted","Data":"0a298caa73f845dc87580ddf74f0a8b2c9b089ec20572cdab6198c9a4ab761bc"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.755699 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" event={"ID":"34e83d3a-faaf-4720-85d2-1430c65810fd","Type":"ContainerStarted","Data":"57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.756999 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.759851 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" event={"ID":"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf","Type":"ContainerStarted","Data":"3d16730d0de202f53d5a81f72b044b0ce64d02b64a64d0740ff7d72084eddf00"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.764389 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" 
event={"ID":"699add67-bf01-4799-80ff-615e4ea6da01","Type":"ContainerStarted","Data":"303deb7663167570ce97b51ba57a016d5464e58e19ee6c478b798020b944fd6f"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.765040 4574 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pcvkf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.765498 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.773226 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ms27n" podStartSLOduration=122.773203207 podStartE2EDuration="2m2.773203207s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:39.769702025 +0000 UTC m=+145.623845067" watchObservedRunningTime="2025-10-04 04:48:39.773203207 +0000 UTC m=+145.627346249" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.779220 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" event={"ID":"c94e6a0d-58e8-40b5-b818-b38b9d79ced1","Type":"ContainerStarted","Data":"17b590f50ea1b09da04f6e6d78e0ea833692d9b590698ef535a4a99a7f230602"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.781934 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l8x2m" 
event={"ID":"87ef4dec-e273-41a2-96de-6c9cc05122d2","Type":"ContainerStarted","Data":"6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.784664 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" event={"ID":"29aee87b-0598-4b50-9b1a-beacaf6d7275","Type":"ContainerStarted","Data":"1754df0411b3661e9c15ccb36f7961821200df49e68b4cfd1308aba3b8aca11a"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.800549 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" event={"ID":"e31ed34c-4127-4040-91fb-c53b671f9ab5","Type":"ContainerStarted","Data":"c2eee3bf7123889097aa522b92786fdf20f9ade94ad1d357194b2e4803971b59"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.802316 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" event={"ID":"f3111436-b5d8-405e-ab14-2fb33bd107c0","Type":"ContainerStarted","Data":"5f7489fd399f5898efa5b782990148d54322afa00b2c0445269d4f581292d9b2"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.805186 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" event={"ID":"9b429dea-1750-4927-a2bb-9ca8f00c4083","Type":"ContainerStarted","Data":"dca1d83c225a5dbb7fbfc78d958b60a6a140de0598dc3849c0572ce072efbd13"} Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.815635 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-l8x2m" podStartSLOduration=122.815616291 podStartE2EDuration="2m2.815616291s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:39.814412716 
+0000 UTC m=+145.668555758" watchObservedRunningTime="2025-10-04 04:48:39.815616291 +0000 UTC m=+145.669759333" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.817428 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" podStartSLOduration=122.817363402 podStartE2EDuration="2m2.817363402s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:39.792835608 +0000 UTC m=+145.646978650" watchObservedRunningTime="2025-10-04 04:48:39.817363402 +0000 UTC m=+145.671506444" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.848752 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.848892 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.348875228 +0000 UTC m=+146.203018270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.849090 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.855338 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.355319126 +0000 UTC m=+146.209462168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.864189 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" podStartSLOduration=122.864167513 podStartE2EDuration="2m2.864167513s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:39.84101432 +0000 UTC m=+145.695157362" watchObservedRunningTime="2025-10-04 04:48:39.864167513 +0000 UTC m=+145.718310555" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.864505 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-24wzl" podStartSLOduration=122.864493723 podStartE2EDuration="2m2.864493723s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:39.86270056 +0000 UTC m=+145.716843612" watchObservedRunningTime="2025-10-04 04:48:39.864493723 +0000 UTC m=+145.718636775" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.871295 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-plrnd" Oct 04 04:48:39 crc kubenswrapper[4574]: I1004 04:48:39.951041 4574 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:39 crc kubenswrapper[4574]: E1004 04:48:39.955525 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.455466759 +0000 UTC m=+146.309609801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.053518 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.053986 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.553971064 +0000 UTC m=+146.408114106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.154099 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.154557 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.654510489 +0000 UTC m=+146.508653531 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.256416 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.256891 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.756866476 +0000 UTC m=+146.611009708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.358350 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.358906 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.858882184 +0000 UTC m=+146.713025226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.380528 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.387061 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:40 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:40 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:40 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.387175 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.460530 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.460897 4574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:40.960886212 +0000 UTC m=+146.815029254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.561658 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.562067 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.062049865 +0000 UTC m=+146.916192907 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.663642 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.664436 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.164414953 +0000 UTC m=+147.018558015 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.765594 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.765963 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.265934367 +0000 UTC m=+147.120077409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.810600 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" event={"ID":"78904868-f0f9-4198-ac3a-130af7060c38","Type":"ContainerStarted","Data":"858185dc661bb0ae32a191d692f036999c10dfd7e7d377c3f6de5b863d5bfee5"} Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.812578 4574 generic.go:334] "Generic (PLEG): container finished" podID="b80b22b2-92cb-4d46-aaa1-1b20a9b38445" containerID="135930d6570c24437103d73d872ea5efc825ef267fe376bb861ed849a28215b9" exitCode=0 Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.812777 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" event={"ID":"b80b22b2-92cb-4d46-aaa1-1b20a9b38445","Type":"ContainerDied","Data":"135930d6570c24437103d73d872ea5efc825ef267fe376bb861ed849a28215b9"} Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.814582 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2nmbr" event={"ID":"69b2231e-4f54-4554-8e7a-d46e644d6b81","Type":"ContainerStarted","Data":"1dc446c9f24aa97dbec72c0a53b228930ca701b00c96b08880fe1546092d8874"} Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.816498 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" 
event={"ID":"f9d839f0-e881-471e-aaf6-a948bb298b17","Type":"ContainerStarted","Data":"10ad83161bb8c6511e7844755a8e7b0885fd7764158148f851c65d11bb0a1fd6"} Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.817490 4574 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pcvkf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.817543 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.817880 4574 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-njnv9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.817915 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" podUID="2b447ff0-4b72-429b-a255-bbd745131936" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.853580 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkh4p" podStartSLOduration=123.853543296 podStartE2EDuration="2m3.853543296s" podCreationTimestamp="2025-10-04 
04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:40.850332692 +0000 UTC m=+146.704475734" watchObservedRunningTime="2025-10-04 04:48:40.853543296 +0000 UTC m=+146.707686338" Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.868073 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.868713 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.368692267 +0000 UTC m=+147.222835309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:40 crc kubenswrapper[4574]: I1004 04:48:40.969903 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:40 crc kubenswrapper[4574]: E1004 04:48:40.971633 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.47160023 +0000 UTC m=+147.325743272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.072220 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.072806 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.572779784 +0000 UTC m=+147.426922826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.169656 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.170628 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.173316 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.173458 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.673427682 +0000 UTC m=+147.527570744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.173585 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b3598bf-0896-4552-883d-48f425fa455d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7b3598bf-0896-4552-883d-48f425fa455d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.173660 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b3598bf-0896-4552-883d-48f425fa455d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7b3598bf-0896-4552-883d-48f425fa455d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.173700 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.174173 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:41.674157954 +0000 UTC m=+147.528300996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.175781 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.176369 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.183605 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.274769 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.275005 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.774965456 +0000 UTC m=+147.629108498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.275282 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.275404 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b3598bf-0896-4552-883d-48f425fa455d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7b3598bf-0896-4552-883d-48f425fa455d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.275469 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b3598bf-0896-4552-883d-48f425fa455d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7b3598bf-0896-4552-883d-48f425fa455d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.275570 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b3598bf-0896-4552-883d-48f425fa455d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"7b3598bf-0896-4552-883d-48f425fa455d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.275720 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.775712578 +0000 UTC m=+147.629855620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.302054 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b3598bf-0896-4552-883d-48f425fa455d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7b3598bf-0896-4552-883d-48f425fa455d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.379811 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.380480 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:41.880455916 +0000 UTC m=+147.734598958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.384911 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:41 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:41 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:41 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.384985 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.481870 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.482493 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:41.982465823 +0000 UTC m=+147.836608865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.487802 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.599729 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.599934 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.09989043 +0000 UTC m=+147.954033472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.600398 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.600439 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.600477 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.601183 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.601254 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.601690 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.101679272 +0000 UTC m=+147.955822314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.609773 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.610506 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.637736 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.646357 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.656656 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.704542 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.704888 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.204863913 +0000 UTC m=+148.059006955 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.806115 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.806676 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.306653134 +0000 UTC m=+148.160796176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.825405 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" event={"ID":"c7efed8f-30b4-470a-9ee5-94f38ed51f37","Type":"ContainerStarted","Data":"74f460116dcbb82ffa4c6550db1b5317b1dd671d865f6a478f377640bea7fe51"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.827205 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" event={"ID":"1bc315a4-bf12-48d0-aa24-da64d82a31f3","Type":"ContainerStarted","Data":"154166a18c4897970152623dc6dd4a99f6db6cabdab8734eaad87c1b87fce53f"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.828509 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" event={"ID":"da00c73e-dcd3-4fb7-aedd-77c84ea82855","Type":"ContainerStarted","Data":"7ec9ef1341af33b007054db48e8ecf09e312f80576f52ff7c3fa55f6198ef9ac"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.829841 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" event={"ID":"1601fa84-c51f-451f-8538-6ee23ed108c1","Type":"ContainerStarted","Data":"1bd7a4835627b4ffa933f7d95771f03998c8616662ce93443e6d7e9d1ecd2adc"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.832182 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" event={"ID":"e45f01df-92bd-4fe0-b70e-cce7a0215e8a","Type":"ContainerStarted","Data":"1c91258594fae603eca551aa56e2e5865ab4ec1cf20811f6c2f0322b35ff6fdf"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.833767 4574 generic.go:334] "Generic (PLEG): container finished" podID="45d7e969-0ef5-4ba5-8259-09dbe9eec354" containerID="3e1ceedff97f1252af9d43e7e0f51d111582500fb6ab8c7409255f0986e6e454" exitCode=0 Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.833833 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" event={"ID":"45d7e969-0ef5-4ba5-8259-09dbe9eec354","Type":"ContainerDied","Data":"3e1ceedff97f1252af9d43e7e0f51d111582500fb6ab8c7409255f0986e6e454"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.838795 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" event={"ID":"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf","Type":"ContainerStarted","Data":"11f6a7b4a35c83287d7aed50c86cf756f8877b7c462f55a3935ed25c886217ac"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.840323 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2qqsp" event={"ID":"f312b88c-5c97-446d-9d7b-e717ac2124fb","Type":"ContainerStarted","Data":"01eeb21b8de7ce0870b81acd1b32db695f24205a83e57926d3ded0cf6b005c21"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.841826 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" event={"ID":"699add67-bf01-4799-80ff-615e4ea6da01","Type":"ContainerStarted","Data":"9d3c8fdd97e141e539f0c96dc79587b4e32f8366ef0fe585d06cc406417d8766"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.843359 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" 
event={"ID":"d5441c46-ed2e-45d3-8cab-6493dd503085","Type":"ContainerStarted","Data":"e3c9b02cf93454a057d4e6192b6bf68cf5232c18a7aa699e94eecd2edd1b8cc6"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.844821 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" event={"ID":"c94e6a0d-58e8-40b5-b818-b38b9d79ced1","Type":"ContainerStarted","Data":"7b785e010beba2443ddc0af0a8e479e134b36482bc08c528ed024d61cd1df8eb"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.846146 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" event={"ID":"5eadb650-b9f5-4f66-a038-0a381546b35d","Type":"ContainerStarted","Data":"fe36071691fbef1f6b7f26815081daef21455d18f4cd67cbf45c40f8a9efde18"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.847811 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" event={"ID":"3ee82682-0c4d-4c04-ad10-5ce85fa21f1f","Type":"ContainerStarted","Data":"7fdf7e3e83f26fd718fe39c40951b66203dc919056951dde43dcb84c8098d2ea"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.849452 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" event={"ID":"0e96869e-a5cb-4b5e-b99f-04f3097b8d4c","Type":"ContainerStarted","Data":"6809f6e8030998b0a01cedfb492cff24ecec621559c10b39f110f26bc1efaeb1"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.850984 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" event={"ID":"29aee87b-0598-4b50-9b1a-beacaf6d7275","Type":"ContainerStarted","Data":"c37f28ff0ee9f99ff699444636f31d08ee1e6134fb1a44d86f3eed97114865bc"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.852562 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" event={"ID":"4eab433d-51c6-4d3e-8c47-329eb8b06c52","Type":"ContainerStarted","Data":"c2013835e41ed3e2416dc166a26c77ba5850321a9548ce4cf12c6449edb4903d"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.855527 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" event={"ID":"f3111436-b5d8-405e-ab14-2fb33bd107c0","Type":"ContainerStarted","Data":"93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.859926 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zxrvh" event={"ID":"b6e9d146-2d36-4313-9f02-2db06b5b5573","Type":"ContainerStarted","Data":"dd4919495ff8407b091acb805b1293cd5b7fbec22ecc446b439bd8de48511407"} Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.860075 4574 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pcvkf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.860154 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.881758 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.916552 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:41 crc kubenswrapper[4574]: E1004 04:48:41.917342 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.417320074 +0000 UTC m=+148.271463116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.917691 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 04 04:48:41 crc kubenswrapper[4574]: I1004 04:48:41.964494 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.018782 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.019074 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.519059604 +0000 UTC m=+148.373202636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.120268 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.120691 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.62067227 +0000 UTC m=+148.474815312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.222578 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.223284 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.723269045 +0000 UTC m=+148.577412087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.324510 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.324884 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.82484642 +0000 UTC m=+148.678989632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.389664 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:42 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:42 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:42 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.389719 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:42 crc kubenswrapper[4574]: W1004 04:48:42.394015 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d955efcfdf4fe9afa2ae7b933a07609c5883644e04aeee3506bec8b29378d9ad WatchSource:0}: Error finding container d955efcfdf4fe9afa2ae7b933a07609c5883644e04aeee3506bec8b29378d9ad: Status 404 returned error can't find the container with id d955efcfdf4fe9afa2ae7b933a07609c5883644e04aeee3506bec8b29378d9ad Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.427537 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.427957 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:42.927941739 +0000 UTC m=+148.782084781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.529120 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.529539 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.029520185 +0000 UTC m=+148.883663227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: W1004 04:48:42.540055 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-d33b34c33fc71ff172135abf94dc1a487013dea8cdb8fe44a155b9960ed23851 WatchSource:0}: Error finding container d33b34c33fc71ff172135abf94dc1a487013dea8cdb8fe44a155b9960ed23851: Status 404 returned error can't find the container with id d33b34c33fc71ff172135abf94dc1a487013dea8cdb8fe44a155b9960ed23851 Oct 04 04:48:42 crc kubenswrapper[4574]: W1004 04:48:42.553286 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9cf0cfec0647fef83adf15b71cca7dfa3b3d668a25dedc69a3a7fc4f9cf892f9 WatchSource:0}: Error finding container 9cf0cfec0647fef83adf15b71cca7dfa3b3d668a25dedc69a3a7fc4f9cf892f9: Status 404 returned error can't find the container with id 9cf0cfec0647fef83adf15b71cca7dfa3b3d668a25dedc69a3a7fc4f9cf892f9 Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.631365 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:42 crc 
kubenswrapper[4574]: E1004 04:48:42.632065 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.132026067 +0000 UTC m=+148.986169109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.732464 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.732807 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.232790868 +0000 UTC m=+149.086933910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.834568 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.835060 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.335040793 +0000 UTC m=+149.189183835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.864266 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7b3598bf-0896-4552-883d-48f425fa455d","Type":"ContainerStarted","Data":"4468ed248a3f8bc39161fd84ce3d8b090759e06bef391820f71fc982057dbace"} Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.865725 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d955efcfdf4fe9afa2ae7b933a07609c5883644e04aeee3506bec8b29378d9ad"} Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.867583 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d33b34c33fc71ff172135abf94dc1a487013dea8cdb8fe44a155b9960ed23851"} Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.870158 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9cf0cfec0647fef83adf15b71cca7dfa3b3d668a25dedc69a3a7fc4f9cf892f9"} Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.871963 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 
04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.871996 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.872802 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.878353 4574 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-n25jn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.878447 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" podUID="f3111436-b5d8-405e-ab14-2fb33bd107c0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.878586 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.878592 4574 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qvwr6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.878606 4574 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.878657 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" podUID="78904868-f0f9-4198-ac3a-130af7060c38" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.906681 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" podStartSLOduration=125.906660847 podStartE2EDuration="2m5.906660847s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:42.904062811 +0000 UTC m=+148.758205853" watchObservedRunningTime="2025-10-04 04:48:42.906660847 +0000 UTC m=+148.760803889" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.936255 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:42 crc kubenswrapper[4574]: E1004 04:48:42.937066 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:43.43701083 +0000 UTC m=+149.291153882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.973917 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" podStartSLOduration=125.973898163 podStartE2EDuration="2m5.973898163s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:42.971022359 +0000 UTC m=+148.825165401" watchObservedRunningTime="2025-10-04 04:48:42.973898163 +0000 UTC m=+148.828041205" Oct 04 04:48:42 crc kubenswrapper[4574]: I1004 04:48:42.999026 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2nmbr" podStartSLOduration=125.999011374 podStartE2EDuration="2m5.999011374s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:42.997570952 +0000 UTC m=+148.851713994" watchObservedRunningTime="2025-10-04 04:48:42.999011374 +0000 UTC m=+148.853154416" Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.038764 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.042339 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.542323574 +0000 UTC m=+149.396466616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.119841 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2qqsp" podStartSLOduration=11.119816708 podStartE2EDuration="11.119816708s" podCreationTimestamp="2025-10-04 04:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:43.042754396 +0000 UTC m=+148.896897438" watchObservedRunningTime="2025-10-04 04:48:43.119816708 +0000 UTC m=+148.973959750" Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.152296 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.160524 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.660495422 +0000 UTC m=+149.514638464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.162346 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.163219 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.6631953 +0000 UTC m=+149.517338342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.265521 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.265940 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.765924679 +0000 UTC m=+149.620067721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.367243 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.367594 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.867578637 +0000 UTC m=+149.721721679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.389474 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:43 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:43 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:43 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.389533 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.470980 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.471246 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:43.971187121 +0000 UTC m=+149.825330163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.471678 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.471994 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:43.971976984 +0000 UTC m=+149.826120026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.572024 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.572314 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.07220689 +0000 UTC m=+149.926349952 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.572499 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.572838 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.072826008 +0000 UTC m=+149.926969050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.673374 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.673999 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.173982591 +0000 UTC m=+150.028125633 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.775102 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.775618 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.275585907 +0000 UTC m=+150.129728949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.878130 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.878757 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.378727598 +0000 UTC m=+150.232870640 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.914831 4574 generic.go:334] "Generic (PLEG): container finished" podID="7b3598bf-0896-4552-883d-48f425fa455d" containerID="e8aa7ee5200558f9ded75f165d9c611789ad2ff4b44584d77e5f981b87155657" exitCode=0 Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.915411 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7b3598bf-0896-4552-883d-48f425fa455d","Type":"ContainerDied","Data":"e8aa7ee5200558f9ded75f165d9c611789ad2ff4b44584d77e5f981b87155657"} Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.939008 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"44dcb1112f8d975f5bc5900445c1b90cb6636b12c67abac5dafc645bbc421207"} Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.939862 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.952846 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" event={"ID":"45d7e969-0ef5-4ba5-8259-09dbe9eec354","Type":"ContainerStarted","Data":"97659ab26018bded5c24cb8007c1e146ad10f2d85686896b7d4e83e3a257ed3c"} Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.978278 4574 generic.go:334] "Generic (PLEG): 
container finished" podID="699add67-bf01-4799-80ff-615e4ea6da01" containerID="9d3c8fdd97e141e539f0c96dc79587b4e32f8366ef0fe585d06cc406417d8766" exitCode=0 Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.978485 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" event={"ID":"699add67-bf01-4799-80ff-615e4ea6da01","Type":"ContainerDied","Data":"9d3c8fdd97e141e539f0c96dc79587b4e32f8366ef0fe585d06cc406417d8766"} Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.979634 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:43 crc kubenswrapper[4574]: E1004 04:48:43.980026 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.480013054 +0000 UTC m=+150.334156096 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:43 crc kubenswrapper[4574]: I1004 04:48:43.986857 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" event={"ID":"da00c73e-dcd3-4fb7-aedd-77c84ea82855","Type":"ContainerStarted","Data":"2b31aa6d4e1082f109a4f94fbb8a0234d9a52731f76513e2eaafeaeff90dd926"} Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.001147 4574 generic.go:334] "Generic (PLEG): container finished" podID="e31ed34c-4127-4040-91fb-c53b671f9ab5" containerID="c2eee3bf7123889097aa522b92786fdf20f9ade94ad1d357194b2e4803971b59" exitCode=0 Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.001215 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" event={"ID":"e31ed34c-4127-4040-91fb-c53b671f9ab5","Type":"ContainerDied","Data":"c2eee3bf7123889097aa522b92786fdf20f9ade94ad1d357194b2e4803971b59"} Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.023013 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" podStartSLOduration=127.022990645 podStartE2EDuration="2m7.022990645s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.018325239 +0000 UTC m=+149.872468281" watchObservedRunningTime="2025-10-04 04:48:44.022990645 +0000 UTC m=+149.877133687" Oct 04 04:48:44 crc 
kubenswrapper[4574]: I1004 04:48:44.023684 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"195c435a5341a8e26eee37c3d145ac42d35625f1186da667a10e0a47a0041812"} Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.039472 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" event={"ID":"f9d839f0-e881-471e-aaf6-a948bb298b17","Type":"ContainerStarted","Data":"4053152a4fec1244b7d88033989dc80c8999c0c384699b18033d1af11ce87bcc"} Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.053221 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"76bca67726694c3362f965e2c14066b790598cc9c5e92a2da27f93d6ab5297ff"} Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.081298 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" event={"ID":"1bc315a4-bf12-48d0-aa24-da64d82a31f3","Type":"ContainerStarted","Data":"659b5905e10cd6b19754b95d6c328265c4546229e464d53a7871aa77eda3aca2"} Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082367 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082431 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial 
tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082490 4574 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-n25jn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082546 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" podUID="f3111436-b5d8-405e-ab14-2fb33bd107c0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082660 4574 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qvwr6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082684 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" podUID="78904868-f0f9-4198-ac3a-130af7060c38" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082855 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082882 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" 
Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.082892 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.084120 4574 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-k52jj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.084152 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" podUID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.084822 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.086385 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.586369489 +0000 UTC m=+150.440512531 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.088550 4574 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-44hzk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.088601 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.088633 4574 patch_prober.go:28] interesting pod/console-operator-58897d9998-ms6sm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.088740 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" podUID="0e96869e-a5cb-4b5e-b99f-04f3097b8d4c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 
04:48:44.189016 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.195389 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.69536667 +0000 UTC m=+150.549509712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.231573 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hkp92" podStartSLOduration=127.231537232 podStartE2EDuration="2m7.231537232s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.14070565 +0000 UTC m=+149.994848692" watchObservedRunningTime="2025-10-04 04:48:44.231537232 +0000 UTC m=+150.085680274" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.231921 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns-operator/dns-operator-744455d44c-j6jgh" podStartSLOduration=127.231913283 podStartE2EDuration="2m7.231913283s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.230176463 +0000 UTC m=+150.084319505" watchObservedRunningTime="2025-10-04 04:48:44.231913283 +0000 UTC m=+150.086056325" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.292958 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.293489 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.793459404 +0000 UTC m=+150.647602446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.383584 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:44 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:44 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:44 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.383963 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.395545 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.396032 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:44.896015907 +0000 UTC m=+150.750158949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.396315 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8xr5z" podStartSLOduration=127.396282935 podStartE2EDuration="2m7.396282935s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.355615682 +0000 UTC m=+150.209758724" watchObservedRunningTime="2025-10-04 04:48:44.396282935 +0000 UTC m=+150.250425977" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.450572 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-tn7qm" podStartSLOduration=127.450543594 podStartE2EDuration="2m7.450543594s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.398816549 +0000 UTC m=+150.252959591" watchObservedRunningTime="2025-10-04 04:48:44.450543594 +0000 UTC m=+150.304686636" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.452000 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" podStartSLOduration=127.451994126 
podStartE2EDuration="2m7.451994126s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.450179323 +0000 UTC m=+150.304322365" watchObservedRunningTime="2025-10-04 04:48:44.451994126 +0000 UTC m=+150.306137168" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.498463 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.499095 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:44.999069265 +0000 UTC m=+150.853212317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.530357 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-wwzd7" podStartSLOduration=127.530308804 podStartE2EDuration="2m7.530308804s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.497934622 +0000 UTC m=+150.352077664" watchObservedRunningTime="2025-10-04 04:48:44.530308804 +0000 UTC m=+150.384451846" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.568045 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr8xq" podStartSLOduration=127.568015081 podStartE2EDuration="2m7.568015081s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.56485795 +0000 UTC m=+150.419000992" watchObservedRunningTime="2025-10-04 04:48:44.568015081 +0000 UTC m=+150.422158123" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.600792 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: 
\"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.601145 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.101132475 +0000 UTC m=+150.955275517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.639710 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" podStartSLOduration=127.639673876 podStartE2EDuration="2m7.639673876s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.604947506 +0000 UTC m=+150.459090548" watchObservedRunningTime="2025-10-04 04:48:44.639673876 +0000 UTC m=+150.493816918" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.683677 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" podStartSLOduration=127.683652716 podStartE2EDuration="2m7.683652716s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.682104501 
+0000 UTC m=+150.536247533" watchObservedRunningTime="2025-10-04 04:48:44.683652716 +0000 UTC m=+150.537795758" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.685083 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nrsmf" podStartSLOduration=127.685073947 podStartE2EDuration="2m7.685073947s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.638473861 +0000 UTC m=+150.492616903" watchObservedRunningTime="2025-10-04 04:48:44.685073947 +0000 UTC m=+150.539216989" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.702174 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.702630 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.202600517 +0000 UTC m=+151.056743559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.702695 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.703144 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.203125792 +0000 UTC m=+151.057268824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.719921 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" podStartSLOduration=127.71988467 podStartE2EDuration="2m7.71988467s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:44.718429018 +0000 UTC m=+150.572572070" watchObservedRunningTime="2025-10-04 04:48:44.71988467 +0000 UTC m=+150.574027702" Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.803728 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.804130 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.30411387 +0000 UTC m=+151.158256912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:44 crc kubenswrapper[4574]: I1004 04:48:44.906004 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:44 crc kubenswrapper[4574]: E1004 04:48:44.906405 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.406389656 +0000 UTC m=+151.260532698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.007128 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.007582 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.507566379 +0000 UTC m=+151.361709421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.027636 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.090420 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" event={"ID":"b80b22b2-92cb-4d46-aaa1-1b20a9b38445","Type":"ContainerStarted","Data":"9bc1421a386a47c6b4cbad5d3f23b0943ec13f3759a1fe4276ba3b80e4dd44eb"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.090465 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" event={"ID":"b80b22b2-92cb-4d46-aaa1-1b20a9b38445","Type":"ContainerStarted","Data":"c9cd9d6cb276a1c8fefffcb5b2f7b3ab301fc60af759bb83d086808e1600faf4"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.109112 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.109502 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:45.609488405 +0000 UTC m=+151.463631447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.112095 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mvj48"] Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.113163 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.114791 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" event={"ID":"e45f01df-92bd-4fe0-b70e-cce7a0215e8a","Type":"ContainerStarted","Data":"06ee7e7cf7b5683a4ba79d21b17396a6a9154c9b9de98e87787211f8dc181675"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.125529 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zxrvh" event={"ID":"b6e9d146-2d36-4313-9f02-2db06b5b5573","Type":"ContainerStarted","Data":"ce8ac42bbce38dd06d90c11a3dd47cad80e944350548eaafb6d81f8240dcfcd4"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.126180 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.130511 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njnv9" Oct 04 04:48:45 crc 
kubenswrapper[4574]: I1004 04:48:45.131811 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.131928 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.132975 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" event={"ID":"699add67-bf01-4799-80ff-615e4ea6da01","Type":"ContainerStarted","Data":"504ff06aa58e7a72cc74342402655f9ad5b55a06ff2432f3e2a60889a909d009"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.133440 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.136408 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" event={"ID":"d5441c46-ed2e-45d3-8cab-6493dd503085","Type":"ContainerStarted","Data":"8d04b339437c667c6b6587e57c761fbeb2ec3a40c56b7482f107acd2094ac9ea"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.149956 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" event={"ID":"a90a7c9e-a3f1-4992-85ea-c8b539f1123f","Type":"ContainerStarted","Data":"2956fa9ac47bfddaa3c6c38575363731a55011bf8bb10fb9217e14888c726c71"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.159749 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvj48"] Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.172333 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" 
event={"ID":"1601fa84-c51f-451f-8538-6ee23ed108c1","Type":"ContainerStarted","Data":"091022771ce80643f0babf1c4cff0ce0ecb10347f8580ef08848b16ea60e192e"} Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.173154 4574 patch_prober.go:28] interesting pod/console-operator-58897d9998-ms6sm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.173195 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" podUID="0e96869e-a5cb-4b5e-b99f-04f3097b8d4c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.173689 4574 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-k52jj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.173875 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" podUID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.174685 4574 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-44hzk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" 
start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.174719 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.210369 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.212970 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.213036 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.213047 4574 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hwfs9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.25:8443/livez\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.213091 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" podUID="b80b22b2-92cb-4d46-aaa1-1b20a9b38445" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.25:8443/livez\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.217283 4574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.7172586 +0000 UTC m=+151.571401642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.230918 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.230963 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.232547 4574 patch_prober.go:28] interesting pod/console-f9d7485db-l8x2m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.232589 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-l8x2m" podUID="87ef4dec-e273-41a2-96de-6c9cc05122d2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.245587 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" podStartSLOduration=128.245566363 podStartE2EDuration="2m8.245566363s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:45.181641664 +0000 UTC m=+151.035784706" watchObservedRunningTime="2025-10-04 04:48:45.245566363 +0000 UTC m=+151.099709415" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.246563 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wpz6v" podStartSLOduration=128.246558311 podStartE2EDuration="2m8.246558311s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:45.242856854 +0000 UTC m=+151.096999896" watchObservedRunningTime="2025-10-04 04:48:45.246558311 +0000 UTC m=+151.100701353" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.314180 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-catalog-content\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.314398 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-utilities\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.314444 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvdzq\" (UniqueName: \"kubernetes.io/projected/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-kube-api-access-bvdzq\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.314482 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.333806 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.833785619 +0000 UTC m=+151.687928661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.335475 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.335577 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.336576 4574 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-q6brr container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.336633 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" podUID="45d7e969-0ef5-4ba5-8259-09dbe9eec354" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.348534 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.348589 4574 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.348661 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.348680 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.380620 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.389712 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:45 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:45 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:45 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.390048 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 
04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.415787 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.416472 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-catalog-content\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.416643 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-utilities\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.416745 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:45.916704272 +0000 UTC m=+151.770847354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.416914 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvdzq\" (UniqueName: \"kubernetes.io/projected/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-kube-api-access-bvdzq\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.417253 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-utilities\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.418018 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-catalog-content\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.518427 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: 
\"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.520410 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.020395718 +0000 UTC m=+151.874538760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.536257 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-frmzn"] Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.537622 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.547241 4574 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-44hzk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.547306 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.548157 4574 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-44hzk container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.548197 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.596850 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.619162 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.619382 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.119351617 +0000 UTC m=+151.973494659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.619576 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.619635 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-utilities\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.619668 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-catalog-content\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.619907 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.119897563 +0000 UTC m=+151.974040605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.619948 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjpg\" (UniqueName: \"kubernetes.io/projected/3548b80e-8db9-4112-a727-6deaf3242864-kube-api-access-nfjpg\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.712871 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvdzq\" (UniqueName: \"kubernetes.io/projected/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-kube-api-access-bvdzq\") pod \"certified-operators-mvj48\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc 
kubenswrapper[4574]: I1004 04:48:45.721729 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.722179 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjpg\" (UniqueName: \"kubernetes.io/projected/3548b80e-8db9-4112-a727-6deaf3242864-kube-api-access-nfjpg\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.722261 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-utilities\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.722299 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-catalog-content\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.723165 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.223110076 +0000 UTC m=+152.077253118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.723367 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-catalog-content\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.723877 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-utilities\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.729995 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.730058 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frmzn"] Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.825979 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.826539 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.326525265 +0000 UTC m=+152.180668307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.880456 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:48:45 crc kubenswrapper[4574]: I1004 04:48:45.929210 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:45 crc kubenswrapper[4574]: E1004 04:48:45.929511 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.42949633 +0000 UTC m=+152.283639372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.010087 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xjdmx" podStartSLOduration=129.010069485 podStartE2EDuration="2m9.010069485s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:45.81775148 +0000 UTC m=+151.671894522" watchObservedRunningTime="2025-10-04 04:48:46.010069485 +0000 UTC m=+151.864212517" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.010478 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xjdlc"] Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.011377 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.030112 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.032593 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.53257664 +0000 UTC m=+152.386719682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.096445 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zxrvh" podStartSLOduration=14.096424927 podStartE2EDuration="14.096424927s" podCreationTimestamp="2025-10-04 04:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:46.03981396 +0000 UTC m=+151.893957012" watchObservedRunningTime="2025-10-04 04:48:46.096424927 +0000 UTC m=+151.950567969" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.098755 4574 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjdlc"] Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.101143 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5xhgz"] Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.102345 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.131607 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.131775 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.631749175 +0000 UTC m=+152.485892217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.131886 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-utilities\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.131975 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.132005 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-catalog-content\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.132062 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g9ll\" (UniqueName: 
\"kubernetes.io/projected/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-kube-api-access-2g9ll\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.132747 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.632699342 +0000 UTC m=+152.486842564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.147198 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qvwr6" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.233882 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.234188 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-25vps" podStartSLOduration=129.234160474 podStartE2EDuration="2m9.234160474s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:46.22749247 +0000 UTC m=+152.081635532" watchObservedRunningTime="2025-10-04 04:48:46.234160474 +0000 UTC m=+152.088303536" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.234338 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g9ll\" (UniqueName: \"kubernetes.io/projected/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-kube-api-access-2g9ll\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.234949 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.734923646 +0000 UTC m=+152.589066698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.237914 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-utilities\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.238005 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-catalog-content\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.238053 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-utilities\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.238171 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6fcx\" (UniqueName: \"kubernetes.io/projected/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-kube-api-access-x6fcx\") pod \"certified-operators-5xhgz\" (UID: 
\"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.238269 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.238314 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-catalog-content\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.239341 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.739317454 +0000 UTC m=+152.593460496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.243217 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-utilities\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.243491 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-catalog-content\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.244571 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjpg\" (UniqueName: \"kubernetes.io/projected/3548b80e-8db9-4112-a727-6deaf3242864-kube-api-access-nfjpg\") pod \"community-operators-frmzn\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.259106 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.322384 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g9ll\" 
(UniqueName: \"kubernetes.io/projected/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-kube-api-access-2g9ll\") pod \"community-operators-xjdlc\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.330754 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.349971 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.350160 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6fcx\" (UniqueName: \"kubernetes.io/projected/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-kube-api-access-x6fcx\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.350404 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-utilities\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.350484 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-catalog-content\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " 
pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.351573 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.851547119 +0000 UTC m=+152.705690161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.353669 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-catalog-content\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.371850 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-utilities\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.389790 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:46 crc 
kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:46 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:46 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.389870 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.456397 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.456815 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:46.956798251 +0000 UTC m=+152.810941293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.458640 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.481083 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6fcx\" (UniqueName: \"kubernetes.io/projected/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-kube-api-access-x6fcx\") pod \"certified-operators-5xhgz\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.508306 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xhgz"] Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.521216 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.557595 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.557893 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.057875572 +0000 UTC m=+152.912018614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.573988 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.574355 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31ed34c-4127-4040-91fb-c53b671f9ab5" containerName="collect-profiles" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.574367 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31ed34c-4127-4040-91fb-c53b671f9ab5" containerName="collect-profiles" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.574516 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31ed34c-4127-4040-91fb-c53b671f9ab5" containerName="collect-profiles" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.574949 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.571184 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" podStartSLOduration=129.571153809 podStartE2EDuration="2m9.571153809s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:46.495690433 +0000 UTC m=+152.349833485" watchObservedRunningTime="2025-10-04 04:48:46.571153809 +0000 UTC m=+152.425296851" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.590955 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.591141 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.621539 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.666465 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31ed34c-4127-4040-91fb-c53b671f9ab5-secret-volume\") pod \"e31ed34c-4127-4040-91fb-c53b671f9ab5\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.666594 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31ed34c-4127-4040-91fb-c53b671f9ab5-config-volume\") pod \"e31ed34c-4127-4040-91fb-c53b671f9ab5\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.666619 4574 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzpjl\" (UniqueName: \"kubernetes.io/projected/e31ed34c-4127-4040-91fb-c53b671f9ab5-kube-api-access-xzpjl\") pod \"e31ed34c-4127-4040-91fb-c53b671f9ab5\" (UID: \"e31ed34c-4127-4040-91fb-c53b671f9ab5\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.666848 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.666883 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.666980 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.668978 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e31ed34c-4127-4040-91fb-c53b671f9ab5-config-volume" (OuterVolumeSpecName: "config-volume") pod "e31ed34c-4127-4040-91fb-c53b671f9ab5" (UID: "e31ed34c-4127-4040-91fb-c53b671f9ab5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.674347 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.1743183 +0000 UTC m=+153.028461342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.685507 4574 patch_prober.go:28] interesting pod/console-operator-58897d9998-ms6sm container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.689445 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" podUID="0e96869e-a5cb-4b5e-b99f-04f3097b8d4c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.689618 4574 patch_prober.go:28] interesting pod/console-operator-58897d9998-ms6sm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.19:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.689637 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" podUID="0e96869e-a5cb-4b5e-b99f-04f3097b8d4c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.704208 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31ed34c-4127-4040-91fb-c53b671f9ab5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e31ed34c-4127-4040-91fb-c53b671f9ab5" (UID: "e31ed34c-4127-4040-91fb-c53b671f9ab5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.713772 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31ed34c-4127-4040-91fb-c53b671f9ab5-kube-api-access-xzpjl" (OuterVolumeSpecName: "kube-api-access-xzpjl") pod "e31ed34c-4127-4040-91fb-c53b671f9ab5" (UID: "e31ed34c-4127-4040-91fb-c53b671f9ab5"). InnerVolumeSpecName "kube-api-access-xzpjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.727799 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.776837 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.777068 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.777119 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.777195 4574 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31ed34c-4127-4040-91fb-c53b671f9ab5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.777209 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzpjl\" (UniqueName: \"kubernetes.io/projected/e31ed34c-4127-4040-91fb-c53b671f9ab5-kube-api-access-xzpjl\") on node \"crc\" DevicePath \"\"" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.777220 4574 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e31ed34c-4127-4040-91fb-c53b671f9ab5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.777285 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.777734 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.277696187 +0000 UTC m=+153.131839229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.820822 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.875106 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.879194 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.879767 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.379751217 +0000 UTC m=+153.233894259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.922026 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.981095 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b3598bf-0896-4552-883d-48f425fa455d-kube-api-access\") pod \"7b3598bf-0896-4552-883d-48f425fa455d\" (UID: \"7b3598bf-0896-4552-883d-48f425fa455d\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.981195 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b3598bf-0896-4552-883d-48f425fa455d-kubelet-dir\") pod \"7b3598bf-0896-4552-883d-48f425fa455d\" (UID: \"7b3598bf-0896-4552-883d-48f425fa455d\") " Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.981513 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b3598bf-0896-4552-883d-48f425fa455d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7b3598bf-0896-4552-883d-48f425fa455d" (UID: "7b3598bf-0896-4552-883d-48f425fa455d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.981594 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.982737 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:47.482703172 +0000 UTC m=+153.336846214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.985565 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3598bf-0896-4552-883d-48f425fa455d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7b3598bf-0896-4552-883d-48f425fa455d" (UID: "7b3598bf-0896-4552-883d-48f425fa455d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:48:46 crc kubenswrapper[4574]: E1004 04:48:46.989930 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.489913602 +0000 UTC m=+153.344056644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.989489 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.990360 4574 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b3598bf-0896-4552-883d-48f425fa455d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 04 04:48:46 crc kubenswrapper[4574]: I1004 04:48:46.990378 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b3598bf-0896-4552-883d-48f425fa455d-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.099422 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.100001 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.599980654 +0000 UTC m=+153.454123706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.140465 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qtgfg"] Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.140881 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3598bf-0896-4552-883d-48f425fa455d" containerName="pruner" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.140903 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3598bf-0896-4552-883d-48f425fa455d" containerName="pruner" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.141019 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3598bf-0896-4552-883d-48f425fa455d" containerName="pruner" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.141917 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.149806 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.175983 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qtgfg"] Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.202185 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.202859 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.702847987 +0000 UTC m=+153.556991029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.279615 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7b3598bf-0896-4552-883d-48f425fa455d","Type":"ContainerDied","Data":"4468ed248a3f8bc39161fd84ce3d8b090759e06bef391820f71fc982057dbace"} Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.279660 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4468ed248a3f8bc39161fd84ce3d8b090759e06bef391820f71fc982057dbace" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.279736 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.319063 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.319333 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bttk\" (UniqueName: \"kubernetes.io/projected/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-kube-api-access-6bttk\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.319372 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-catalog-content\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.319442 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-utilities\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.319535 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.819519771 +0000 UTC m=+153.673662813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.335600 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.336093 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq" event={"ID":"e31ed34c-4127-4040-91fb-c53b671f9ab5","Type":"ContainerDied","Data":"7217d6ffcca59f1383a58cd7db06b8369b44bee5f51de9fc9c741175ec944437"} Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.336119 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7217d6ffcca59f1383a58cd7db06b8369b44bee5f51de9fc9c741175ec944437" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.390535 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:47 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:47 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:47 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.390600 4574 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.422062 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.422105 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-utilities\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.422134 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bttk\" (UniqueName: \"kubernetes.io/projected/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-kube-api-access-6bttk\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.422163 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-catalog-content\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.422572 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-catalog-content\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.429436 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-utilities\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.429733 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:47.923911588 +0000 UTC m=+153.778054630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.523281 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.523832 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.023793784 +0000 UTC m=+153.877936836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.629441 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.629762 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.129751927 +0000 UTC m=+153.983894969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.732922 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.733252 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.233210246 +0000 UTC m=+154.087353288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.733762 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.734128 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.234114373 +0000 UTC m=+154.088257415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.794798 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-khxqf"] Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.795882 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.835482 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.836117 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.33610187 +0000 UTC m=+154.190244912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.842033 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bttk\" (UniqueName: \"kubernetes.io/projected/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-kube-api-access-6bttk\") pod \"redhat-marketplace-qtgfg\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.940703 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-catalog-content\") pod \"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.941097 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkc2\" (UniqueName: \"kubernetes.io/projected/aaec2754-49e7-4b88-b913-1c19269e6b97-kube-api-access-6hkc2\") pod \"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.941291 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-utilities\") pod \"redhat-marketplace-khxqf\" (UID: 
\"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.941419 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:47 crc kubenswrapper[4574]: E1004 04:48:47.941869 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.441852237 +0000 UTC m=+154.295995279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:47 crc kubenswrapper[4574]: I1004 04:48:47.948453 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.044442 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.044864 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.544833093 +0000 UTC m=+154.398976135 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.045568 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-catalog-content\") pod \"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.045621 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkc2\" (UniqueName: \"kubernetes.io/projected/aaec2754-49e7-4b88-b913-1c19269e6b97-kube-api-access-6hkc2\") pod 
\"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.045702 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-utilities\") pod \"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.045755 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.047121 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-catalog-content\") pod \"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.048009 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-utilities\") pod \"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.048290 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:48.548269693 +0000 UTC m=+154.402412745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.073354 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khxqf"] Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.131431 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvj48"] Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.146624 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.147168 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.647141989 +0000 UTC m=+154.501285031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.148177 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkc2\" (UniqueName: \"kubernetes.io/projected/aaec2754-49e7-4b88-b913-1c19269e6b97-kube-api-access-6hkc2\") pod \"redhat-marketplace-khxqf\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.214992 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.250124 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.250510 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.750497336 +0000 UTC m=+154.604640368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.351355 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.351928 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.851911437 +0000 UTC m=+154.706054469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.389430 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:48 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:48 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:48 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.389510 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.417212 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvj48" event={"ID":"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7","Type":"ContainerStarted","Data":"a8b270f981663ace03ddc252409f72167850fd89d6a4a1b6459fa4a630d2a31c"} Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.453278 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.453686 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:48.953673747 +0000 UTC m=+154.807816789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.555495 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.555709 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.055676695 +0000 UTC m=+154.909819737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.556226 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.556743 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.056720845 +0000 UTC m=+154.910863887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.658760 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.659144 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.159127365 +0000 UTC m=+155.013270407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.760197 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.760572 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.260557816 +0000 UTC m=+155.114700858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.874412 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.874747 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.374732796 +0000 UTC m=+155.228875838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.874821 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.875298 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.375225501 +0000 UTC m=+155.229368543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.898389 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8sgv5"] Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.899568 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.910702 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 04 04:48:48 crc kubenswrapper[4574]: I1004 04:48:48.977142 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:48 crc kubenswrapper[4574]: E1004 04:48:48.979001 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.478979399 +0000 UTC m=+155.333122441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.086718 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-utilities\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.086779 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.086888 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m9n9\" (UniqueName: \"kubernetes.io/projected/b7a786e2-3629-456c-a861-3e5abcd343a2-kube-api-access-2m9n9\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.086932 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-catalog-content\") 
pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.087321 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.587307381 +0000 UTC m=+155.441450423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.117655 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xhgz"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.139170 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frmzn"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.165126 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8sgv5"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.188787 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.189223 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2m9n9\" (UniqueName: \"kubernetes.io/projected/b7a786e2-3629-456c-a861-3e5abcd343a2-kube-api-access-2m9n9\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.189319 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-catalog-content\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.189358 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-utilities\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.189908 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-utilities\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.190171 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-catalog-content\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.190189 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.689917066 +0000 UTC m=+155.544060218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.270388 4574 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dzvnb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.270463 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" podUID="699add67-bf01-4799-80ff-615e4ea6da01" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.272806 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m9n9\" (UniqueName: \"kubernetes.io/projected/b7a786e2-3629-456c-a861-3e5abcd343a2-kube-api-access-2m9n9\") pod \"redhat-operators-8sgv5\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " 
pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.272925 4574 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dzvnb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.272968 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" podUID="699add67-bf01-4799-80ff-615e4ea6da01" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.294961 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.300438 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.300949 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.800931576 +0000 UTC m=+155.655074618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.340614 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.402740 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.403133 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:49.903116979 +0000 UTC m=+155.757260011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.416859 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4fjgj"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.417906 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.417949 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.423735 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.435451 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:49 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:49 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:49 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.435509 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.541673 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.542192 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.042172874 +0000 UTC m=+155.896315916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.563604 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjdlc"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.591067 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qtgfg"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.644722 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" event={"ID":"a90a7c9e-a3f1-4992-85ea-c8b539f1123f","Type":"ContainerStarted","Data":"1effd257614eeedc34e43fdee6a7721dd0ae4fd47a70dd0a493887a0c3fd9dd5"} Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.645750 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.645973 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-catalog-content\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.646012 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpc8w\" (UniqueName: \"kubernetes.io/projected/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-kube-api-access-dpc8w\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.646036 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-utilities\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.646224 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.146207401 +0000 UTC m=+156.000350443 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.653307 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frmzn" event={"ID":"3548b80e-8db9-4112-a727-6deaf3242864","Type":"ContainerStarted","Data":"555c491eb17f97474a761ca8787f2e9ec43cbc16513cf3cd8f9d6a6823552f22"} Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.656120 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xhgz" event={"ID":"93ad9da7-d07b-4a0c-8f91-6af543b99e3e","Type":"ContainerStarted","Data":"69531635c2cff6d893e09b18b5d6fa4d48d845de0936ffb6c5322218936e0943"} Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.658551 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8","Type":"ContainerStarted","Data":"81fd46a38b534038b718dd568da8306b8181db5aaff9f036f3b67ff46d0a2f06"} Oct 04 04:48:49 crc kubenswrapper[4574]: W1004 04:48:49.659620 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4de8b5_1b24_4ccd_bb4c_5b6bba86e4d6.slice/crio-9714482480fcdf3560fbc0b7d35dc77e75a7604a822174bfdace1b7c711e8d4a WatchSource:0}: Error finding container 9714482480fcdf3560fbc0b7d35dc77e75a7604a822174bfdace1b7c711e8d4a: Status 404 returned error can't find the container with id 9714482480fcdf3560fbc0b7d35dc77e75a7604a822174bfdace1b7c711e8d4a Oct 04 04:48:49 crc kubenswrapper[4574]: 
I1004 04:48:49.662514 4574 generic.go:334] "Generic (PLEG): container finished" podID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerID="490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2" exitCode=0 Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.662562 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvj48" event={"ID":"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7","Type":"ContainerDied","Data":"490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2"} Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.664739 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.682220 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4fjgj"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.749335 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.749382 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-catalog-content\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.749424 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpc8w\" (UniqueName: \"kubernetes.io/projected/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-kube-api-access-dpc8w\") pod 
\"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.749445 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-utilities\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.749990 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-utilities\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.750684 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.2506522 +0000 UTC m=+156.104795242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.760684 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-catalog-content\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.793411 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khxqf"] Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.849176 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpc8w\" (UniqueName: \"kubernetes.io/projected/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-kube-api-access-dpc8w\") pod \"redhat-operators-4fjgj\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.858831 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.861377 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.360047722 +0000 UTC m=+156.214190764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:49 crc kubenswrapper[4574]: I1004 04:48:49.960910 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:49 crc kubenswrapper[4574]: E1004 04:48:49.961443 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.461430062 +0000 UTC m=+156.315573104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.059017 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.062217 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.062356 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.562328957 +0000 UTC m=+156.416472009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.062662 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.062974 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.562960046 +0000 UTC m=+156.417103078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.163653 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.164147 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.664128359 +0000 UTC m=+156.518271401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.265347 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.265757 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.765743765 +0000 UTC m=+156.619886807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.337494 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.362273 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q6brr" Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.366986 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.367147 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.867120115 +0000 UTC m=+156.721263167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.367424 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.368212 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:50.868204116 +0000 UTC m=+156.722347158 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.390522 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:50 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:50 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:50 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.390962 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.472684 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.473047 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-04 04:48:50.973032046 +0000 UTC m=+156.827175088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.575147 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.575807 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.075795586 +0000 UTC m=+156.929938628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.659479 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.676378 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.676508 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.176477235 +0000 UTC m=+157.030620277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.676681 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.677170 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.177149135 +0000 UTC m=+157.031292187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.692806 4574 generic.go:334] "Generic (PLEG): container finished" podID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerID="11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d" exitCode=0 Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.692921 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qtgfg" event={"ID":"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6","Type":"ContainerDied","Data":"11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.692950 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qtgfg" event={"ID":"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6","Type":"ContainerStarted","Data":"9714482480fcdf3560fbc0b7d35dc77e75a7604a822174bfdace1b7c711e8d4a"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.710529 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" event={"ID":"a90a7c9e-a3f1-4992-85ea-c8b539f1123f","Type":"ContainerStarted","Data":"aeb4fbaaf24cbaac7db4caa48d303e1255c2ea14e65b58f4622572272c3585f9"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.729732 4574 generic.go:334] "Generic (PLEG): container finished" podID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerID="bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8" exitCode=0 Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.729807 
4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjdlc" event={"ID":"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de","Type":"ContainerDied","Data":"bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.729838 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjdlc" event={"ID":"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de","Type":"ContainerStarted","Data":"cc78f3f93534e373d7a1f1ca8e4a2edb60455a9673c7cc6eda9ccd1d293f0865"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.761512 4574 generic.go:334] "Generic (PLEG): container finished" podID="3548b80e-8db9-4112-a727-6deaf3242864" containerID="8cda84a031fa0f9c3218d7d7a40ab5d3bb712630b53e1f3a98891bc91dcedfa5" exitCode=0 Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.761641 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frmzn" event={"ID":"3548b80e-8db9-4112-a727-6deaf3242864","Type":"ContainerDied","Data":"8cda84a031fa0f9c3218d7d7a40ab5d3bb712630b53e1f3a98891bc91dcedfa5"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.779355 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.780729 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.280712908 +0000 UTC m=+157.134855950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.796878 4574 generic.go:334] "Generic (PLEG): container finished" podID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerID="53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627" exitCode=0 Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.796993 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khxqf" event={"ID":"aaec2754-49e7-4b88-b913-1c19269e6b97","Type":"ContainerDied","Data":"53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.797025 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khxqf" event={"ID":"aaec2754-49e7-4b88-b913-1c19269e6b97","Type":"ContainerStarted","Data":"5efc7f24b77bc3aff4c239b70dfb4efdd49573810f6260b9703c166ccac78565"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.825622 4574 generic.go:334] "Generic (PLEG): container finished" podID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerID="43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5" exitCode=0 Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.825748 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xhgz" event={"ID":"93ad9da7-d07b-4a0c-8f91-6af543b99e3e","Type":"ContainerDied","Data":"43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.863036 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8","Type":"ContainerStarted","Data":"c5a02b1848028795529f08810b289bfb6736917e330ec3b8326aa418063cde10"} Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.883721 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.894193 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.394173259 +0000 UTC m=+157.248316301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:50 crc kubenswrapper[4574]: I1004 04:48:50.984772 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:50 crc kubenswrapper[4574]: E1004 04:48:50.985401 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.485381552 +0000 UTC m=+157.339524604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.087614 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.088067 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.588053619 +0000 UTC m=+157.442196661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.135040 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8sgv5"] Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.191217 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.191659 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.691637053 +0000 UTC m=+157.545780095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.293806 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.295009 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.79499333 +0000 UTC m=+157.649136372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.305741 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=5.305718922 podStartE2EDuration="5.305718922s" podCreationTimestamp="2025-10-04 04:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:51.304124776 +0000 UTC m=+157.158267828" watchObservedRunningTime="2025-10-04 04:48:51.305718922 +0000 UTC m=+157.159861964" Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.312671 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dzvnb" Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.383203 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4fjgj"] Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.391956 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:51 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:51 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:51 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.392016 4574 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.395887 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.396974 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.896934766 +0000 UTC m=+157.751077808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.469308 4574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.497142 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.497805 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:51.997774039 +0000 UTC m=+157.851917241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.598044 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.598478 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:52.098461679 +0000 UTC m=+157.952604721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.700057 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.700494 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:52.200482637 +0000 UTC m=+158.054625679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.715219 4574 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hwfs9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]log ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]etcd ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/generic-apiserver-start-informers ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/max-in-flight-filter ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 04 04:48:51 crc kubenswrapper[4574]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 04 04:48:51 crc kubenswrapper[4574]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/project.openshift.io-projectcache ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/openshift.io-startinformers ok Oct 04 04:48:51 crc kubenswrapper[4574]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 04 04:48:51 crc 
kubenswrapper[4574]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 04 04:48:51 crc kubenswrapper[4574]: livez check failed Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.715337 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" podUID="b80b22b2-92cb-4d46-aaa1-1b20a9b38445" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.801152 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.801446 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:52.301423764 +0000 UTC m=+158.155566806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.801556 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.801918 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-04 04:48:52.301906938 +0000 UTC m=+158.156049980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qbwcp" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.898914 4574 generic.go:334] "Generic (PLEG): container finished" podID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerID="9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589" exitCode=0 Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.899100 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjgj" event={"ID":"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005","Type":"ContainerDied","Data":"9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589"} Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.899471 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjgj" event={"ID":"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005","Type":"ContainerStarted","Data":"321f96435a023e2b21d8ed461ef18d1a656e7f1eea05c963975802ebd733f893"} Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.902318 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:51 crc kubenswrapper[4574]: E1004 04:48:51.902699 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-04 04:48:52.40268554 +0000 UTC m=+158.256828582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.925247 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" event={"ID":"a90a7c9e-a3f1-4992-85ea-c8b539f1123f","Type":"ContainerStarted","Data":"210819568d19bd8caaecd480f1d3abd4af08bb5845ddea6359821fb73c6a425a"} Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.938520 4574 generic.go:334] "Generic (PLEG): container finished" podID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerID="3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2" exitCode=0 Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.938612 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sgv5" event={"ID":"b7a786e2-3629-456c-a861-3e5abcd343a2","Type":"ContainerDied","Data":"3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2"} Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.938648 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sgv5" event={"ID":"b7a786e2-3629-456c-a861-3e5abcd343a2","Type":"ContainerStarted","Data":"82336a25cb66e539c3b2750f4d03245599e8be5a9f7ea895fc31dbab4081ea81"} Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.947175 4574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-04T04:48:51.469337162Z","Handler":null,"Name":""} Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.954038 4574 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.954088 4574 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.954922 4574 generic.go:334] "Generic (PLEG): container finished" podID="ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8" containerID="c5a02b1848028795529f08810b289bfb6736917e330ec3b8326aa418063cde10" exitCode=0 Oct 04 04:48:51 crc kubenswrapper[4574]: I1004 04:48:51.954982 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8","Type":"ContainerDied","Data":"c5a02b1848028795529f08810b289bfb6736917e330ec3b8326aa418063cde10"} Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.005278 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.040595 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kv6r9" podStartSLOduration=20.040566071 podStartE2EDuration="20.040566071s" podCreationTimestamp="2025-10-04 04:48:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:52.038679586 +0000 UTC m=+157.892822628" watchObservedRunningTime="2025-10-04 04:48:52.040566071 +0000 UTC m=+157.894709113" Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.208392 4574 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.208461 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.379159 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qbwcp\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.413331 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.416443 4574 
patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:52 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:52 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:52 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.416590 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.476797 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.502697 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:52 crc kubenswrapper[4574]: I1004 04:48:52.758651 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.385055 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:53 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:53 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:53 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.385120 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.417316 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qbwcp"] Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.424824 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.536528 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kube-api-access\") pod \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.536686 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kubelet-dir\") pod \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\" (UID: \"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8\") " Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.537111 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8" (UID: "ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.565569 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8" (UID: "ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.592583 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zxrvh" Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.638304 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 04 04:48:53 crc kubenswrapper[4574]: I1004 04:48:53.638373 4574 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 04 04:48:54 crc kubenswrapper[4574]: I1004 04:48:54.033033 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8","Type":"ContainerDied","Data":"81fd46a38b534038b718dd568da8306b8181db5aaff9f036f3b67ff46d0a2f06"} Oct 04 04:48:54 crc kubenswrapper[4574]: I1004 04:48:54.033088 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81fd46a38b534038b718dd568da8306b8181db5aaff9f036f3b67ff46d0a2f06" Oct 04 04:48:54 crc kubenswrapper[4574]: I1004 04:48:54.033168 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 04 04:48:54 crc kubenswrapper[4574]: I1004 04:48:54.046535 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" event={"ID":"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe","Type":"ContainerStarted","Data":"7d651d632ca0c71d5144d5fe834db2128d7234539a3e77fcc8c27d793503be36"} Oct 04 04:48:54 crc kubenswrapper[4574]: I1004 04:48:54.386535 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:54 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:54 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:54 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:54 crc kubenswrapper[4574]: I1004 04:48:54.386604 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.180179 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" event={"ID":"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe","Type":"ContainerStarted","Data":"c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470"} Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.180653 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.214340 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" 
podStartSLOduration=138.214321835 podStartE2EDuration="2m18.214321835s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:48:55.211644267 +0000 UTC m=+161.065787309" watchObservedRunningTime="2025-10-04 04:48:55.214321835 +0000 UTC m=+161.068464877" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.220421 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.230417 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hwfs9" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.234531 4574 patch_prober.go:28] interesting pod/console-f9d7485db-l8x2m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.234612 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-l8x2m" podUID="87ef4dec-e273-41a2-96de-6c9cc05122d2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.354643 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.354682 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.354725 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.354755 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.390480 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:55 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:55 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:55 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.390568 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.565159 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:48:55 crc kubenswrapper[4574]: I1004 04:48:55.687244 4574 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ms6sm" Oct 04 04:48:56 crc kubenswrapper[4574]: I1004 04:48:56.384469 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:56 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:56 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:56 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:56 crc kubenswrapper[4574]: I1004 04:48:56.384546 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:57 crc kubenswrapper[4574]: I1004 04:48:57.384148 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:57 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:57 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:57 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:57 crc kubenswrapper[4574]: I1004 04:48:57.384611 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:58 crc kubenswrapper[4574]: I1004 04:48:58.382969 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:58 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:58 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:58 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:58 crc kubenswrapper[4574]: I1004 04:48:58.383023 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:59 crc kubenswrapper[4574]: I1004 04:48:59.384019 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:48:59 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:48:59 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:48:59 crc kubenswrapper[4574]: healthz check failed Oct 04 04:48:59 crc kubenswrapper[4574]: I1004 04:48:59.384157 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:48:59 crc kubenswrapper[4574]: I1004 04:48:59.620017 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:59 crc kubenswrapper[4574]: I1004 04:48:59.632876 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/833018b5-b584-4e77-a95f-fe56f6dd5945-metrics-certs\") pod \"network-metrics-daemon-stmq5\" (UID: \"833018b5-b584-4e77-a95f-fe56f6dd5945\") " pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:48:59 crc kubenswrapper[4574]: I1004 04:48:59.851820 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-stmq5" Oct 04 04:49:00 crc kubenswrapper[4574]: I1004 04:49:00.388849 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:49:00 crc kubenswrapper[4574]: [-]has-synced failed: reason withheld Oct 04 04:49:00 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:49:00 crc kubenswrapper[4574]: healthz check failed Oct 04 04:49:00 crc kubenswrapper[4574]: I1004 04:49:00.389389 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:49:00 crc kubenswrapper[4574]: I1004 04:49:00.672617 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-stmq5"] Oct 04 04:49:01 crc kubenswrapper[4574]: I1004 04:49:01.382687 4574 patch_prober.go:28] interesting pod/router-default-5444994796-ms27n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 04 04:49:01 crc kubenswrapper[4574]: [+]has-synced ok Oct 04 04:49:01 crc kubenswrapper[4574]: [+]process-running ok Oct 04 04:49:01 crc kubenswrapper[4574]: healthz check failed Oct 04 04:49:01 crc kubenswrapper[4574]: I1004 04:49:01.383301 4574 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ms27n" podUID="00d74c2b-550a-43a4-858a-be942ffece17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 04:49:02 crc kubenswrapper[4574]: I1004 04:49:02.387932 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:49:02 crc kubenswrapper[4574]: I1004 04:49:02.395892 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ms27n" Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.251529 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.256565 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.349139 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.349647 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.349720 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.349289 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.350135 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.350550 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"1dc446c9f24aa97dbec72c0a53b228930ca701b00c96b08880fe1546092d8874"} pod="openshift-console/downloads-7954f5f757-2nmbr" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.350618 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.350656 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" containerID="cri-o://1dc446c9f24aa97dbec72c0a53b228930ca701b00c96b08880fe1546092d8874" gracePeriod=2 Oct 04 04:49:05 crc kubenswrapper[4574]: I1004 04:49:05.350707 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 
10.217.0.38:8080: connect: connection refused" Oct 04 04:49:07 crc kubenswrapper[4574]: I1004 04:49:07.513546 4574 generic.go:334] "Generic (PLEG): container finished" podID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerID="1dc446c9f24aa97dbec72c0a53b228930ca701b00c96b08880fe1546092d8874" exitCode=0 Oct 04 04:49:07 crc kubenswrapper[4574]: I1004 04:49:07.513616 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2nmbr" event={"ID":"69b2231e-4f54-4554-8e7a-d46e644d6b81","Type":"ContainerDied","Data":"1dc446c9f24aa97dbec72c0a53b228930ca701b00c96b08880fe1546092d8874"} Oct 04 04:49:11 crc kubenswrapper[4574]: W1004 04:49:11.880654 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod833018b5_b584_4e77_a95f_fe56f6dd5945.slice/crio-30de3f795bd5fac606243f492152cbd1f636a2d783fdb684538bbd1cfdccf336 WatchSource:0}: Error finding container 30de3f795bd5fac606243f492152cbd1f636a2d783fdb684538bbd1cfdccf336: Status 404 returned error can't find the container with id 30de3f795bd5fac606243f492152cbd1f636a2d783fdb684538bbd1cfdccf336 Oct 04 04:49:12 crc kubenswrapper[4574]: I1004 04:49:12.509105 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:49:12 crc kubenswrapper[4574]: I1004 04:49:12.574777 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-stmq5" event={"ID":"833018b5-b584-4e77-a95f-fe56f6dd5945","Type":"ContainerStarted","Data":"30de3f795bd5fac606243f492152cbd1f636a2d783fdb684538bbd1cfdccf336"} Oct 04 04:49:15 crc kubenswrapper[4574]: I1004 04:49:15.134938 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xmxmq" Oct 04 04:49:15 crc kubenswrapper[4574]: I1004 04:49:15.348465 4574 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:15 crc kubenswrapper[4574]: I1004 04:49:15.348547 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:19 crc kubenswrapper[4574]: I1004 04:49:19.404958 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:49:19 crc kubenswrapper[4574]: I1004 04:49:19.405456 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:49:22 crc kubenswrapper[4574]: I1004 04:49:22.114920 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 04 04:49:25 crc kubenswrapper[4574]: I1004 04:49:25.352873 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:25 crc kubenswrapper[4574]: I1004 04:49:25.352940 4574 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:33 crc kubenswrapper[4574]: E1004 04:49:33.853624 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 04 04:49:33 crc kubenswrapper[4574]: E1004 04:49:33.854818 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m9n9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorPro
file:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8sgv5_openshift-marketplace(b7a786e2-3629-456c-a861-3e5abcd343a2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 04:49:33 crc kubenswrapper[4574]: E1004 04:49:33.856069 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8sgv5" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.286288 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8sgv5" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" Oct 04 04:49:35 crc kubenswrapper[4574]: I1004 04:49:35.350300 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:35 crc kubenswrapper[4574]: I1004 04:49:35.350410 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:35 crc 
kubenswrapper[4574]: E1004 04:49:35.362336 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.362496 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nfjpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-frmzn_openshift-marketplace(3548b80e-8db9-4112-a727-6deaf3242864): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.363639 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-frmzn" podUID="3548b80e-8db9-4112-a727-6deaf3242864" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.382440 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.383758 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dpc8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4fjgj_openshift-marketplace(738405ee-2a5a-4ae0-a9aa-cdbad4fc0005): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.385141 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4fjgj" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" Oct 04 04:49:35 crc 
kubenswrapper[4574]: E1004 04:49:35.677840 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.678030 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2g9ll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-xjdlc_openshift-marketplace(49a1d35e-a6a7-4d50-9ec1-90c3ff9295de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 04:49:35 crc kubenswrapper[4574]: E1004 04:49:35.679816 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xjdlc" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.341019 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xjdlc" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.341082 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-frmzn" podUID="3548b80e-8db9-4112-a727-6deaf3242864" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.341643 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4fjgj" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.443952 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.444170 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bvdzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mvj48_openshift-marketplace(4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.445444 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mvj48" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.473517 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.473765 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6fcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5xhgz_openshift-marketplace(93ad9da7-d07b-4a0c-8f91-6af543b99e3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 04:49:37 crc kubenswrapper[4574]: E1004 04:49:37.475016 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5xhgz" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" Oct 04 04:49:40 crc 
kubenswrapper[4574]: E1004 04:49:40.609748 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5xhgz" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" Oct 04 04:49:40 crc kubenswrapper[4574]: E1004 04:49:40.609765 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mvj48" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" Oct 04 04:49:42 crc kubenswrapper[4574]: E1004 04:49:42.501599 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 04 04:49:42 crc kubenswrapper[4574]: E1004 04:49:42.502120 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6hkc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-khxqf_openshift-marketplace(aaec2754-49e7-4b88-b913-1c19269e6b97): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 04:49:42 crc kubenswrapper[4574]: E1004 04:49:42.503779 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-khxqf" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" Oct 04 04:49:42 crc 
kubenswrapper[4574]: E1004 04:49:42.659827 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 04 04:49:42 crc kubenswrapper[4574]: E1004 04:49:42.659993 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6bttk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-qtgfg_openshift-marketplace(5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 04 04:49:42 crc kubenswrapper[4574]: E1004 04:49:42.662739 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qtgfg" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" Oct 04 04:49:42 crc kubenswrapper[4574]: I1004 04:49:42.742442 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:42 crc kubenswrapper[4574]: I1004 04:49:42.742492 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:42 crc kubenswrapper[4574]: E1004 04:49:42.745174 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qtgfg" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" Oct 04 04:49:42 crc kubenswrapper[4574]: E1004 04:49:42.745174 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-khxqf" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" Oct 04 04:49:42 crc kubenswrapper[4574]: I1004 04:49:42.753665 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-stmq5" event={"ID":"833018b5-b584-4e77-a95f-fe56f6dd5945","Type":"ContainerStarted","Data":"b8e67d8e2093486a65034e692699f1cd48821ee2bcecc3ec8690e8547936056e"} Oct 04 04:49:42 crc kubenswrapper[4574]: I1004 04:49:42.753711 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:49:42 crc kubenswrapper[4574]: I1004 04:49:42.753727 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2nmbr" event={"ID":"69b2231e-4f54-4554-8e7a-d46e644d6b81","Type":"ContainerStarted","Data":"d6694f99835e185603343c877391c7bdebe17272f3dd4f702a3d56e6e56f79ab"} Oct 04 04:49:43 crc kubenswrapper[4574]: I1004 04:49:43.747159 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-stmq5" event={"ID":"833018b5-b584-4e77-a95f-fe56f6dd5945","Type":"ContainerStarted","Data":"42485ab910a3d3d49f6309ff09e27e3719f3e85548c2b9df10f93419e2279eef"} Oct 04 04:49:43 crc kubenswrapper[4574]: I1004 04:49:43.747699 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:43 crc kubenswrapper[4574]: I1004 04:49:43.747750 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: 
connection refused" Oct 04 04:49:45 crc kubenswrapper[4574]: I1004 04:49:45.348498 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:45 crc kubenswrapper[4574]: I1004 04:49:45.348917 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:45 crc kubenswrapper[4574]: I1004 04:49:45.348499 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:45 crc kubenswrapper[4574]: I1004 04:49:45.349141 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:47 crc kubenswrapper[4574]: I1004 04:49:47.763683 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-stmq5" podStartSLOduration=190.7636379 podStartE2EDuration="3m10.7636379s" podCreationTimestamp="2025-10-04 04:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:49:43.767423055 +0000 UTC m=+209.621566157" watchObservedRunningTime="2025-10-04 04:49:47.7636379 +0000 UTC m=+213.617780942" Oct 04 
04:49:49 crc kubenswrapper[4574]: I1004 04:49:49.404591 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:49:49 crc kubenswrapper[4574]: I1004 04:49:49.405038 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:49:49 crc kubenswrapper[4574]: I1004 04:49:49.405100 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:49:49 crc kubenswrapper[4574]: I1004 04:49:49.405650 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 04:49:49 crc kubenswrapper[4574]: I1004 04:49:49.405711 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75" gracePeriod=600 Oct 04 04:49:49 crc kubenswrapper[4574]: I1004 04:49:49.784450 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" 
containerID="31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75" exitCode=0 Oct 04 04:49:49 crc kubenswrapper[4574]: I1004 04:49:49.784501 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75"} Oct 04 04:49:50 crc kubenswrapper[4574]: I1004 04:49:50.805097 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"b0fb3a0b8021a1604e8449f303902c5fed8c9c286e729c6c4d14a3c41e72b327"} Oct 04 04:49:52 crc kubenswrapper[4574]: I1004 04:49:52.846215 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjdlc" event={"ID":"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de","Type":"ContainerStarted","Data":"9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16"} Oct 04 04:49:52 crc kubenswrapper[4574]: I1004 04:49:52.852568 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sgv5" event={"ID":"b7a786e2-3629-456c-a861-3e5abcd343a2","Type":"ContainerStarted","Data":"aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190"} Oct 04 04:49:52 crc kubenswrapper[4574]: I1004 04:49:52.861402 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjgj" event={"ID":"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005","Type":"ContainerStarted","Data":"569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7"} Oct 04 04:49:53 crc kubenswrapper[4574]: I1004 04:49:53.882847 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvj48" 
event={"ID":"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7","Type":"ContainerStarted","Data":"eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901"} Oct 04 04:49:53 crc kubenswrapper[4574]: I1004 04:49:53.884899 4574 generic.go:334] "Generic (PLEG): container finished" podID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerID="9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16" exitCode=0 Oct 04 04:49:53 crc kubenswrapper[4574]: I1004 04:49:53.884961 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjdlc" event={"ID":"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de","Type":"ContainerDied","Data":"9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16"} Oct 04 04:49:54 crc kubenswrapper[4574]: I1004 04:49:54.905541 4574 generic.go:334] "Generic (PLEG): container finished" podID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerID="aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190" exitCode=0 Oct 04 04:49:54 crc kubenswrapper[4574]: I1004 04:49:54.905610 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sgv5" event={"ID":"b7a786e2-3629-456c-a861-3e5abcd343a2","Type":"ContainerDied","Data":"aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190"} Oct 04 04:49:54 crc kubenswrapper[4574]: I1004 04:49:54.908151 4574 generic.go:334] "Generic (PLEG): container finished" podID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerID="569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7" exitCode=0 Oct 04 04:49:54 crc kubenswrapper[4574]: I1004 04:49:54.908389 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjgj" event={"ID":"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005","Type":"ContainerDied","Data":"569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7"} Oct 04 04:49:54 crc kubenswrapper[4574]: I1004 04:49:54.920481 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-frmzn" event={"ID":"3548b80e-8db9-4112-a727-6deaf3242864","Type":"ContainerStarted","Data":"9d15215c978488ec724a284316b09efd22e5d07cfd257f7f0263af8c76647ddc"} Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.348646 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.348706 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.348799 4574 patch_prober.go:28] interesting pod/downloads-7954f5f757-2nmbr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.348876 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2nmbr" podUID="69b2231e-4f54-4554-8e7a-d46e644d6b81" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.927703 4574 generic.go:334] "Generic (PLEG): container finished" podID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerID="eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901" exitCode=0 Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.927793 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mvj48" event={"ID":"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7","Type":"ContainerDied","Data":"eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901"} Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.933616 4574 generic.go:334] "Generic (PLEG): container finished" podID="3548b80e-8db9-4112-a727-6deaf3242864" containerID="9d15215c978488ec724a284316b09efd22e5d07cfd257f7f0263af8c76647ddc" exitCode=0 Oct 04 04:49:55 crc kubenswrapper[4574]: I1004 04:49:55.933673 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frmzn" event={"ID":"3548b80e-8db9-4112-a727-6deaf3242864","Type":"ContainerDied","Data":"9d15215c978488ec724a284316b09efd22e5d07cfd257f7f0263af8c76647ddc"} Oct 04 04:50:01 crc kubenswrapper[4574]: I1004 04:50:01.970876 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xhgz" event={"ID":"93ad9da7-d07b-4a0c-8f91-6af543b99e3e","Type":"ContainerStarted","Data":"0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f"} Oct 04 04:50:02 crc kubenswrapper[4574]: I1004 04:50:02.977282 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sgv5" event={"ID":"b7a786e2-3629-456c-a861-3e5abcd343a2","Type":"ContainerStarted","Data":"bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7"} Oct 04 04:50:02 crc kubenswrapper[4574]: I1004 04:50:02.979029 4574 generic.go:334] "Generic (PLEG): container finished" podID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerID="0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f" exitCode=0 Oct 04 04:50:02 crc kubenswrapper[4574]: I1004 04:50:02.979111 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xhgz" 
event={"ID":"93ad9da7-d07b-4a0c-8f91-6af543b99e3e","Type":"ContainerDied","Data":"0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f"} Oct 04 04:50:04 crc kubenswrapper[4574]: I1004 04:50:04.009515 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8sgv5" podStartSLOduration=7.314570446 podStartE2EDuration="1m16.009479481s" podCreationTimestamp="2025-10-04 04:48:48 +0000 UTC" firstStartedPulling="2025-10-04 04:48:51.941647293 +0000 UTC m=+157.795790335" lastFinishedPulling="2025-10-04 04:50:00.636556328 +0000 UTC m=+226.490699370" observedRunningTime="2025-10-04 04:50:03.999429649 +0000 UTC m=+229.853572701" watchObservedRunningTime="2025-10-04 04:50:04.009479481 +0000 UTC m=+229.863622523" Oct 04 04:50:05 crc kubenswrapper[4574]: I1004 04:50:05.356584 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2nmbr" Oct 04 04:50:09 crc kubenswrapper[4574]: I1004 04:50:09.341947 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:50:09 crc kubenswrapper[4574]: I1004 04:50:09.342394 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:50:12 crc kubenswrapper[4574]: I1004 04:50:12.267918 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8sgv5" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="registry-server" probeResult="failure" output=< Oct 04 04:50:12 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 04:50:12 crc kubenswrapper[4574]: > Oct 04 04:50:19 crc kubenswrapper[4574]: I1004 04:50:19.394523 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:50:19 crc kubenswrapper[4574]: I1004 
04:50:19.450296 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.076653 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qtgfg" event={"ID":"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6","Type":"ContainerStarted","Data":"b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3"} Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.084093 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvj48" event={"ID":"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7","Type":"ContainerStarted","Data":"0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f"} Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.087563 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjgj" event={"ID":"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005","Type":"ContainerStarted","Data":"e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c"} Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.093808 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjdlc" event={"ID":"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de","Type":"ContainerStarted","Data":"d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3"} Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.095654 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frmzn" event={"ID":"3548b80e-8db9-4112-a727-6deaf3242864","Type":"ContainerStarted","Data":"4e080c27a2d8f40598f51f55483c2f236a547d01cfea28660a8e8d7837095284"} Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.105077 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xhgz" 
event={"ID":"93ad9da7-d07b-4a0c-8f91-6af543b99e3e","Type":"ContainerStarted","Data":"dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5"} Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.110384 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khxqf" event={"ID":"aaec2754-49e7-4b88-b913-1c19269e6b97","Type":"ContainerStarted","Data":"1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba"} Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.139488 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xjdlc" podStartSLOduration=6.284381569 podStartE2EDuration="1m35.139467356s" podCreationTimestamp="2025-10-04 04:48:45 +0000 UTC" firstStartedPulling="2025-10-04 04:48:50.741020823 +0000 UTC m=+156.595163865" lastFinishedPulling="2025-10-04 04:50:19.59610661 +0000 UTC m=+245.450249652" observedRunningTime="2025-10-04 04:50:20.137587243 +0000 UTC m=+245.991730285" watchObservedRunningTime="2025-10-04 04:50:20.139467356 +0000 UTC m=+245.993610398" Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.163100 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-frmzn" podStartSLOduration=8.921570609 podStartE2EDuration="1m35.163077119s" podCreationTimestamp="2025-10-04 04:48:45 +0000 UTC" firstStartedPulling="2025-10-04 04:48:50.768603345 +0000 UTC m=+156.622746387" lastFinishedPulling="2025-10-04 04:50:17.010109855 +0000 UTC m=+242.864252897" observedRunningTime="2025-10-04 04:50:20.159370215 +0000 UTC m=+246.013513267" watchObservedRunningTime="2025-10-04 04:50:20.163077119 +0000 UTC m=+246.017220161" Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.216293 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mvj48" podStartSLOduration=7.394131823 
podStartE2EDuration="1m35.216266293s" podCreationTimestamp="2025-10-04 04:48:45 +0000 UTC" firstStartedPulling="2025-10-04 04:48:49.664419411 +0000 UTC m=+155.518562453" lastFinishedPulling="2025-10-04 04:50:17.486553881 +0000 UTC m=+243.340696923" observedRunningTime="2025-10-04 04:50:20.213379542 +0000 UTC m=+246.067522594" watchObservedRunningTime="2025-10-04 04:50:20.216266293 +0000 UTC m=+246.070409345" Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.217493 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4fjgj" podStartSLOduration=3.524036456 podStartE2EDuration="1m31.217484858s" podCreationTimestamp="2025-10-04 04:48:49 +0000 UTC" firstStartedPulling="2025-10-04 04:48:51.901492795 +0000 UTC m=+157.755635837" lastFinishedPulling="2025-10-04 04:50:19.594941197 +0000 UTC m=+245.449084239" observedRunningTime="2025-10-04 04:50:20.191394955 +0000 UTC m=+246.045538007" watchObservedRunningTime="2025-10-04 04:50:20.217484858 +0000 UTC m=+246.071627900" Oct 04 04:50:20 crc kubenswrapper[4574]: I1004 04:50:20.258658 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5xhgz" podStartSLOduration=6.444084796 podStartE2EDuration="1m35.258629574s" podCreationTimestamp="2025-10-04 04:48:45 +0000 UTC" firstStartedPulling="2025-10-04 04:48:50.830482556 +0000 UTC m=+156.684625598" lastFinishedPulling="2025-10-04 04:50:19.645027334 +0000 UTC m=+245.499170376" observedRunningTime="2025-10-04 04:50:20.256509764 +0000 UTC m=+246.110652806" watchObservedRunningTime="2025-10-04 04:50:20.258629574 +0000 UTC m=+246.112772616" Oct 04 04:50:21 crc kubenswrapper[4574]: I1004 04:50:21.120257 4574 generic.go:334] "Generic (PLEG): container finished" podID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerID="1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba" exitCode=0 Oct 04 04:50:21 crc kubenswrapper[4574]: I1004 04:50:21.120363 
4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khxqf" event={"ID":"aaec2754-49e7-4b88-b913-1c19269e6b97","Type":"ContainerDied","Data":"1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba"} Oct 04 04:50:21 crc kubenswrapper[4574]: I1004 04:50:21.126043 4574 generic.go:334] "Generic (PLEG): container finished" podID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerID="b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3" exitCode=0 Oct 04 04:50:21 crc kubenswrapper[4574]: I1004 04:50:21.126089 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qtgfg" event={"ID":"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6","Type":"ContainerDied","Data":"b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3"} Oct 04 04:50:22 crc kubenswrapper[4574]: I1004 04:50:22.142972 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khxqf" event={"ID":"aaec2754-49e7-4b88-b913-1c19269e6b97","Type":"ContainerStarted","Data":"1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577"} Oct 04 04:50:22 crc kubenswrapper[4574]: I1004 04:50:22.146477 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qtgfg" event={"ID":"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6","Type":"ContainerStarted","Data":"ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4"} Oct 04 04:50:22 crc kubenswrapper[4574]: I1004 04:50:22.163369 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-khxqf" podStartSLOduration=4.308334303 podStartE2EDuration="1m35.163351017s" podCreationTimestamp="2025-10-04 04:48:47 +0000 UTC" firstStartedPulling="2025-10-04 04:48:50.802808181 +0000 UTC m=+156.656951223" lastFinishedPulling="2025-10-04 04:50:21.657824895 +0000 UTC m=+247.511967937" observedRunningTime="2025-10-04 04:50:22.162001329 
+0000 UTC m=+248.016144371" watchObservedRunningTime="2025-10-04 04:50:22.163351017 +0000 UTC m=+248.017494059" Oct 04 04:50:25 crc kubenswrapper[4574]: I1004 04:50:25.731174 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:50:25 crc kubenswrapper[4574]: I1004 04:50:25.731586 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:50:25 crc kubenswrapper[4574]: I1004 04:50:25.778036 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:50:25 crc kubenswrapper[4574]: I1004 04:50:25.801866 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qtgfg" podStartSLOduration=7.920643822 podStartE2EDuration="1m38.801844761s" podCreationTimestamp="2025-10-04 04:48:47 +0000 UTC" firstStartedPulling="2025-10-04 04:48:50.703310476 +0000 UTC m=+156.557453518" lastFinishedPulling="2025-10-04 04:50:21.584511415 +0000 UTC m=+247.438654457" observedRunningTime="2025-10-04 04:50:22.191349563 +0000 UTC m=+248.045492615" watchObservedRunningTime="2025-10-04 04:50:25.801844761 +0000 UTC m=+251.655987803" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.216589 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.332489 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.332544 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.373311 4574 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.460294 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.460412 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.497913 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.729069 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.729377 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:50:26 crc kubenswrapper[4574]: I1004 04:50:26.770162 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:50:27 crc kubenswrapper[4574]: I1004 04:50:27.218518 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:50:27 crc kubenswrapper[4574]: I1004 04:50:27.219044 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:50:27 crc kubenswrapper[4574]: I1004 04:50:27.230605 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:50:27 crc kubenswrapper[4574]: I1004 04:50:27.936271 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 
04:50:27 crc kubenswrapper[4574]: I1004 04:50:27.936364 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:50:27 crc kubenswrapper[4574]: I1004 04:50:27.990739 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:50:28 crc kubenswrapper[4574]: I1004 04:50:28.215719 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:50:28 crc kubenswrapper[4574]: I1004 04:50:28.215776 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:50:28 crc kubenswrapper[4574]: I1004 04:50:28.226570 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:50:28 crc kubenswrapper[4574]: I1004 04:50:28.272743 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:50:28 crc kubenswrapper[4574]: I1004 04:50:28.635552 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjdlc"] Oct 04 04:50:28 crc kubenswrapper[4574]: I1004 04:50:28.832281 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xhgz"] Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.189322 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xjdlc" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="registry-server" containerID="cri-o://d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3" gracePeriod=2 Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.189899 4574 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-5xhgz" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="registry-server" containerID="cri-o://dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5" gracePeriod=2 Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.236732 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.588917 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.622653 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-utilities\") pod \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.622714 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g9ll\" (UniqueName: \"kubernetes.io/projected/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-kube-api-access-2g9ll\") pod \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.622792 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-catalog-content\") pod \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\" (UID: \"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de\") " Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.624401 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-utilities" (OuterVolumeSpecName: "utilities") pod "49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" (UID: 
"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.631590 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-kube-api-access-2g9ll" (OuterVolumeSpecName: "kube-api-access-2g9ll") pod "49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" (UID: "49a1d35e-a6a7-4d50-9ec1-90c3ff9295de"). InnerVolumeSpecName "kube-api-access-2g9ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.679486 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.688985 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" (UID: "49a1d35e-a6a7-4d50-9ec1-90c3ff9295de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.723693 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6fcx\" (UniqueName: \"kubernetes.io/projected/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-kube-api-access-x6fcx\") pod \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.723749 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-utilities\") pod \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.723837 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-catalog-content\") pod \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\" (UID: \"93ad9da7-d07b-4a0c-8f91-6af543b99e3e\") " Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.724118 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.724134 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g9ll\" (UniqueName: \"kubernetes.io/projected/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-kube-api-access-2g9ll\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.724145 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.724727 
4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-utilities" (OuterVolumeSpecName: "utilities") pod "93ad9da7-d07b-4a0c-8f91-6af543b99e3e" (UID: "93ad9da7-d07b-4a0c-8f91-6af543b99e3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.726145 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-kube-api-access-x6fcx" (OuterVolumeSpecName: "kube-api-access-x6fcx") pod "93ad9da7-d07b-4a0c-8f91-6af543b99e3e" (UID: "93ad9da7-d07b-4a0c-8f91-6af543b99e3e"). InnerVolumeSpecName "kube-api-access-x6fcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.772899 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93ad9da7-d07b-4a0c-8f91-6af543b99e3e" (UID: "93ad9da7-d07b-4a0c-8f91-6af543b99e3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.825998 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.826036 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:29 crc kubenswrapper[4574]: I1004 04:50:29.826075 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6fcx\" (UniqueName: \"kubernetes.io/projected/93ad9da7-d07b-4a0c-8f91-6af543b99e3e-kube-api-access-x6fcx\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.060010 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.060089 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.107603 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.195929 4574 generic.go:334] "Generic (PLEG): container finished" podID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerID="dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5" exitCode=0 Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.196006 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xhgz" 
event={"ID":"93ad9da7-d07b-4a0c-8f91-6af543b99e3e","Type":"ContainerDied","Data":"dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5"} Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.196037 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xhgz" event={"ID":"93ad9da7-d07b-4a0c-8f91-6af543b99e3e","Type":"ContainerDied","Data":"69531635c2cff6d893e09b18b5d6fa4d48d845de0936ffb6c5322218936e0943"} Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.196058 4574 scope.go:117] "RemoveContainer" containerID="dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.196751 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xhgz" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.202622 4574 generic.go:334] "Generic (PLEG): container finished" podID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerID="d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3" exitCode=0 Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.202705 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjdlc" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.202773 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjdlc" event={"ID":"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de","Type":"ContainerDied","Data":"d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3"} Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.202871 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjdlc" event={"ID":"49a1d35e-a6a7-4d50-9ec1-90c3ff9295de","Type":"ContainerDied","Data":"cc78f3f93534e373d7a1f1ca8e4a2edb60455a9673c7cc6eda9ccd1d293f0865"} Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.231494 4574 scope.go:117] "RemoveContainer" containerID="0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.249913 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xhgz"] Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.260291 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5xhgz"] Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.266218 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjdlc"] Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.270566 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.276992 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xjdlc"] Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.280406 4574 scope.go:117] "RemoveContainer" containerID="43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5" Oct 04 04:50:30 crc kubenswrapper[4574]: 
I1004 04:50:30.312361 4574 scope.go:117] "RemoveContainer" containerID="dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5" Oct 04 04:50:30 crc kubenswrapper[4574]: E1004 04:50:30.313152 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5\": container with ID starting with dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5 not found: ID does not exist" containerID="dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.313191 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5"} err="failed to get container status \"dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5\": rpc error: code = NotFound desc = could not find container \"dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5\": container with ID starting with dcac46a9546464941bfddaca0a34f1291cd2e0e96a4a405fc1177a2ae51ceeb5 not found: ID does not exist" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.313217 4574 scope.go:117] "RemoveContainer" containerID="0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f" Oct 04 04:50:30 crc kubenswrapper[4574]: E1004 04:50:30.313561 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f\": container with ID starting with 0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f not found: ID does not exist" containerID="0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.313589 4574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f"} err="failed to get container status \"0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f\": rpc error: code = NotFound desc = could not find container \"0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f\": container with ID starting with 0a01a0edc6cd2a544b86e61e797f32646f6e4f60ff81335420c9423d64e97d5f not found: ID does not exist" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.313605 4574 scope.go:117] "RemoveContainer" containerID="43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5" Oct 04 04:50:30 crc kubenswrapper[4574]: E1004 04:50:30.313972 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5\": container with ID starting with 43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5 not found: ID does not exist" containerID="43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.313995 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5"} err="failed to get container status \"43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5\": rpc error: code = NotFound desc = could not find container \"43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5\": container with ID starting with 43ef34f85e5594a4f2c9c0b45adb0294472dda08d30e013c90e42470c13f12d5 not found: ID does not exist" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.314011 4574 scope.go:117] "RemoveContainer" containerID="d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.348607 4574 scope.go:117] "RemoveContainer" 
containerID="9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.367781 4574 scope.go:117] "RemoveContainer" containerID="bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.391601 4574 scope.go:117] "RemoveContainer" containerID="d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3" Oct 04 04:50:30 crc kubenswrapper[4574]: E1004 04:50:30.393431 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3\": container with ID starting with d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3 not found: ID does not exist" containerID="d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.393469 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3"} err="failed to get container status \"d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3\": rpc error: code = NotFound desc = could not find container \"d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3\": container with ID starting with d0289b0869eab0ab98134e131445b3bf9829456bf97cd4879ba79f855c3ac2c3 not found: ID does not exist" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.393499 4574 scope.go:117] "RemoveContainer" containerID="9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16" Oct 04 04:50:30 crc kubenswrapper[4574]: E1004 04:50:30.393819 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16\": container with ID starting with 
9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16 not found: ID does not exist" containerID="9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.393842 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16"} err="failed to get container status \"9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16\": rpc error: code = NotFound desc = could not find container \"9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16\": container with ID starting with 9e002c7a7a8fb0d2d0161df8a81c86bd3cc5e0a7ccc9484e710537c59d4ebd16 not found: ID does not exist" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.393859 4574 scope.go:117] "RemoveContainer" containerID="bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8" Oct 04 04:50:30 crc kubenswrapper[4574]: E1004 04:50:30.394063 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8\": container with ID starting with bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8 not found: ID does not exist" containerID="bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.394097 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8"} err="failed to get container status \"bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8\": rpc error: code = NotFound desc = could not find container \"bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8\": container with ID starting with bf5dec21b423cd0200dfdfb579409213c2df2bc326cca7f9fbf05559f23b7db8 not found: ID does not 
exist" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.741778 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" path="/var/lib/kubelet/pods/49a1d35e-a6a7-4d50-9ec1-90c3ff9295de/volumes" Oct 04 04:50:30 crc kubenswrapper[4574]: I1004 04:50:30.742519 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" path="/var/lib/kubelet/pods/93ad9da7-d07b-4a0c-8f91-6af543b99e3e/volumes" Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.032457 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-khxqf"] Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.210717 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-khxqf" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="registry-server" containerID="cri-o://1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577" gracePeriod=2 Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.834278 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.853326 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hkc2\" (UniqueName: \"kubernetes.io/projected/aaec2754-49e7-4b88-b913-1c19269e6b97-kube-api-access-6hkc2\") pod \"aaec2754-49e7-4b88-b913-1c19269e6b97\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.853476 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-utilities\") pod \"aaec2754-49e7-4b88-b913-1c19269e6b97\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.853521 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-catalog-content\") pod \"aaec2754-49e7-4b88-b913-1c19269e6b97\" (UID: \"aaec2754-49e7-4b88-b913-1c19269e6b97\") " Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.854383 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-utilities" (OuterVolumeSpecName: "utilities") pod "aaec2754-49e7-4b88-b913-1c19269e6b97" (UID: "aaec2754-49e7-4b88-b913-1c19269e6b97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.860566 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaec2754-49e7-4b88-b913-1c19269e6b97-kube-api-access-6hkc2" (OuterVolumeSpecName: "kube-api-access-6hkc2") pod "aaec2754-49e7-4b88-b913-1c19269e6b97" (UID: "aaec2754-49e7-4b88-b913-1c19269e6b97"). InnerVolumeSpecName "kube-api-access-6hkc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.872462 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aaec2754-49e7-4b88-b913-1c19269e6b97" (UID: "aaec2754-49e7-4b88-b913-1c19269e6b97"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.955417 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hkc2\" (UniqueName: \"kubernetes.io/projected/aaec2754-49e7-4b88-b913-1c19269e6b97-kube-api-access-6hkc2\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.955738 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:31 crc kubenswrapper[4574]: I1004 04:50:31.955819 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaec2754-49e7-4b88-b913-1c19269e6b97-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.218591 4574 generic.go:334] "Generic (PLEG): container finished" podID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerID="1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577" exitCode=0 Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.218649 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khxqf" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.218666 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khxqf" event={"ID":"aaec2754-49e7-4b88-b913-1c19269e6b97","Type":"ContainerDied","Data":"1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577"} Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.220458 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khxqf" event={"ID":"aaec2754-49e7-4b88-b913-1c19269e6b97","Type":"ContainerDied","Data":"5efc7f24b77bc3aff4c239b70dfb4efdd49573810f6260b9703c166ccac78565"} Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.220553 4574 scope.go:117] "RemoveContainer" containerID="1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.242803 4574 scope.go:117] "RemoveContainer" containerID="1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.253633 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-khxqf"] Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.259684 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-khxqf"] Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.266615 4574 scope.go:117] "RemoveContainer" containerID="53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.285599 4574 scope.go:117] "RemoveContainer" containerID="1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577" Oct 04 04:50:32 crc kubenswrapper[4574]: E1004 04:50:32.286440 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577\": container with ID starting with 1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577 not found: ID does not exist" containerID="1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.286476 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577"} err="failed to get container status \"1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577\": rpc error: code = NotFound desc = could not find container \"1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577\": container with ID starting with 1d10f409e02003503bad7200283cf3673ebfa80bacea94bec090dfba0337c577 not found: ID does not exist" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.286504 4574 scope.go:117] "RemoveContainer" containerID="1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba" Oct 04 04:50:32 crc kubenswrapper[4574]: E1004 04:50:32.287095 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba\": container with ID starting with 1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba not found: ID does not exist" containerID="1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.287299 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba"} err="failed to get container status \"1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba\": rpc error: code = NotFound desc = could not find container \"1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba\": container with ID 
starting with 1c7416b17dbc909a617c8ae15bce163aaa76a7eeaab260396127f343098239ba not found: ID does not exist" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.287419 4574 scope.go:117] "RemoveContainer" containerID="53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627" Oct 04 04:50:32 crc kubenswrapper[4574]: E1004 04:50:32.288147 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627\": container with ID starting with 53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627 not found: ID does not exist" containerID="53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.288208 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627"} err="failed to get container status \"53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627\": rpc error: code = NotFound desc = could not find container \"53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627\": container with ID starting with 53eb09620bd1986e90deb24e96879da2788b492b134d3cadc8d06be83f030627 not found: ID does not exist" Oct 04 04:50:32 crc kubenswrapper[4574]: I1004 04:50:32.742639 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" path="/var/lib/kubelet/pods/aaec2754-49e7-4b88-b913-1c19269e6b97/volumes" Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.436778 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4fjgj"] Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.437077 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4fjgj" 
podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="registry-server" containerID="cri-o://e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c" gracePeriod=2 Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.937837 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.980973 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-utilities\") pod \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.981085 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpc8w\" (UniqueName: \"kubernetes.io/projected/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-kube-api-access-dpc8w\") pod \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.981134 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-catalog-content\") pod \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\" (UID: \"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005\") " Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.985367 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-utilities" (OuterVolumeSpecName: "utilities") pod "738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" (UID: "738405ee-2a5a-4ae0-a9aa-cdbad4fc0005"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:33 crc kubenswrapper[4574]: I1004 04:50:33.989920 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-kube-api-access-dpc8w" (OuterVolumeSpecName: "kube-api-access-dpc8w") pod "738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" (UID: "738405ee-2a5a-4ae0-a9aa-cdbad4fc0005"). InnerVolumeSpecName "kube-api-access-dpc8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.081808 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.082076 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpc8w\" (UniqueName: \"kubernetes.io/projected/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-kube-api-access-dpc8w\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.084669 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" (UID: "738405ee-2a5a-4ae0-a9aa-cdbad4fc0005"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.183307 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.244458 4574 generic.go:334] "Generic (PLEG): container finished" podID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerID="e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c" exitCode=0 Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.244535 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjgj" event={"ID":"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005","Type":"ContainerDied","Data":"e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c"} Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.244606 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4fjgj" event={"ID":"738405ee-2a5a-4ae0-a9aa-cdbad4fc0005","Type":"ContainerDied","Data":"321f96435a023e2b21d8ed461ef18d1a656e7f1eea05c963975802ebd733f893"} Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.244659 4574 scope.go:117] "RemoveContainer" containerID="e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.252417 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4fjgj" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.284419 4574 scope.go:117] "RemoveContainer" containerID="569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.304332 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4fjgj"] Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.305684 4574 scope.go:117] "RemoveContainer" containerID="9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.324158 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4fjgj"] Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.329228 4574 scope.go:117] "RemoveContainer" containerID="e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c" Oct 04 04:50:34 crc kubenswrapper[4574]: E1004 04:50:34.329817 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c\": container with ID starting with e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c not found: ID does not exist" containerID="e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.329856 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c"} err="failed to get container status \"e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c\": rpc error: code = NotFound desc = could not find container \"e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c\": container with ID starting with e7b4be1dea32b96544c4706abdffc17c47318ad36fdca489d5d8d3c5aea52c7c not found: ID does 
not exist" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.329889 4574 scope.go:117] "RemoveContainer" containerID="569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7" Oct 04 04:50:34 crc kubenswrapper[4574]: E1004 04:50:34.330116 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7\": container with ID starting with 569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7 not found: ID does not exist" containerID="569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.330141 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7"} err="failed to get container status \"569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7\": rpc error: code = NotFound desc = could not find container \"569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7\": container with ID starting with 569ac6dc92fc1cb0a30d0c26e45451e8cb1a621ab8341ec9a72ec9522167e7d7 not found: ID does not exist" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.330158 4574 scope.go:117] "RemoveContainer" containerID="9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589" Oct 04 04:50:34 crc kubenswrapper[4574]: E1004 04:50:34.330620 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589\": container with ID starting with 9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589 not found: ID does not exist" containerID="9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.330733 4574 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589"} err="failed to get container status \"9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589\": rpc error: code = NotFound desc = could not find container \"9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589\": container with ID starting with 9092e24a502a1d2f9e623db2d4bf2e2e72c29a6d1fa5ab024e48b89d134a6589 not found: ID does not exist" Oct 04 04:50:34 crc kubenswrapper[4574]: I1004 04:50:34.742019 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" path="/var/lib/kubelet/pods/738405ee-2a5a-4ae0-a9aa-cdbad4fc0005/volumes" Oct 04 04:50:54 crc kubenswrapper[4574]: I1004 04:50:54.938000 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pcvkf"] Oct 04 04:51:19 crc kubenswrapper[4574]: I1004 04:51:19.969428 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerName="oauth-openshift" containerID="cri-o://57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c" gracePeriod=15 Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.306754 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341032 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-77c8c5f65c-v2548"] Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341708 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerName="oauth-openshift" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341727 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerName="oauth-openshift" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341739 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341746 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341754 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341761 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341771 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341777 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341787 4574 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341793 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341801 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8" containerName="pruner" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341807 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8" containerName="pruner" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341816 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341821 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341833 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341839 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341847 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341854 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341865 4574 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341871 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341879 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341887 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341896 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341901 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="extract-utilities" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341910 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341916 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.341924 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.341931 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="extract-content" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.342023 4574 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="93ad9da7-d07b-4a0c-8f91-6af543b99e3e" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.342038 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaec2754-49e7-4b88-b913-1c19269e6b97" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.342046 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a1d35e-a6a7-4d50-9ec1-90c3ff9295de" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.342055 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerName="oauth-openshift" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.342061 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="738405ee-2a5a-4ae0-a9aa-cdbad4fc0005" containerName="registry-server" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.342073 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff96b52b-d1d1-4162-8cc5-b97d7b54c2d8" containerName="pruner" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.342585 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.353714 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77c8c5f65c-v2548"] Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433143 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-service-ca\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433216 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-provider-selection\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433415 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-router-certs\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433623 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-trusted-ca-bundle\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433670 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-session\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433702 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-dir\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433758 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-idp-0-file-data\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433794 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tscnx\" (UniqueName: \"kubernetes.io/projected/34e83d3a-faaf-4720-85d2-1430c65810fd-kube-api-access-tscnx\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433857 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-login\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433890 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-serving-cert\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.434521 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.434552 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.433913 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-error\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.434795 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-ocp-branding-template\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.434825 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-policies\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.434865 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-cliconfig\") pod \"34e83d3a-faaf-4720-85d2-1430c65810fd\" (UID: \"34e83d3a-faaf-4720-85d2-1430c65810fd\") " Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.435431 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.435456 4574 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.435849 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.435992 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.436446 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.440317 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.440779 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.441174 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e83d3a-faaf-4720-85d2-1430c65810fd-kube-api-access-tscnx" (OuterVolumeSpecName: "kube-api-access-tscnx") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "kube-api-access-tscnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.441380 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.441665 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.441829 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.442204 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.442961 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.443251 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "34e83d3a-faaf-4720-85d2-1430c65810fd" (UID: "34e83d3a-faaf-4720-85d2-1430c65810fd"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.496656 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.497261 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" event={"ID":"34e83d3a-faaf-4720-85d2-1430c65810fd","Type":"ContainerDied","Data":"57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c"} Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.497122 4574 generic.go:334] "Generic (PLEG): container finished" podID="34e83d3a-faaf-4720-85d2-1430c65810fd" containerID="57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c" exitCode=0 Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.499375 4574 scope.go:117] "RemoveContainer" containerID="57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.499414 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pcvkf" event={"ID":"34e83d3a-faaf-4720-85d2-1430c65810fd","Type":"ContainerDied","Data":"f59bcf1f1142884018f77030b0a4dcccda99357764470d2955906650f947264f"} Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.529460 4574 scope.go:117] "RemoveContainer" containerID="57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c" Oct 04 04:51:20 crc kubenswrapper[4574]: E1004 04:51:20.529911 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c\": container with ID starting with 57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c not found: ID does not exist" 
containerID="57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.529941 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c"} err="failed to get container status \"57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c\": rpc error: code = NotFound desc = could not find container \"57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c\": container with ID starting with 57f73336b185a0297d540faba67e8a8c3b1809941fcdecb9d62c4fcd61dd9f5c not found: ID does not exist" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.532863 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pcvkf"] Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.536850 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.536885 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-error\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.536905 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-audit-policies\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.536924 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.536949 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.536971 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.536993 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-audit-dir\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: 
\"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537018 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537042 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-router-certs\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537060 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-session\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537091 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 
04:51:20.537109 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cjlc\" (UniqueName: \"kubernetes.io/projected/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-kube-api-access-2cjlc\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537142 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-service-ca\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537163 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-login\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537209 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537223 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 
04:51:20.537278 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537292 4574 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537302 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537313 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tscnx\" (UniqueName: \"kubernetes.io/projected/34e83d3a-faaf-4720-85d2-1430c65810fd-kube-api-access-tscnx\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537324 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537335 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537346 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc 
kubenswrapper[4574]: I1004 04:51:20.537358 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537370 4574 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.537381 4574 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34e83d3a-faaf-4720-85d2-1430c65810fd-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.538406 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pcvkf"] Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639042 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639089 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjlc\" (UniqueName: \"kubernetes.io/projected/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-kube-api-access-2cjlc\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639110 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-service-ca\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639129 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-login\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639176 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639197 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-error\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639217 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-audit-policies\") pod 
\"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639260 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639280 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639297 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639316 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-audit-dir\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639332 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639348 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-router-certs\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639366 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-session\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.639927 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-audit-dir\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.640489 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: 
\"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.641085 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.641642 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-audit-policies\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.641986 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-service-ca\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.643751 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.643856 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-session\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.645720 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.645822 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-router-certs\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.645999 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-error\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.646697 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " 
pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.647887 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.648781 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-v4-0-config-user-template-login\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.657594 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cjlc\" (UniqueName: \"kubernetes.io/projected/a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab-kube-api-access-2cjlc\") pod \"oauth-openshift-77c8c5f65c-v2548\" (UID: \"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab\") " pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.664517 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.755698 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e83d3a-faaf-4720-85d2-1430c65810fd" path="/var/lib/kubelet/pods/34e83d3a-faaf-4720-85d2-1430c65810fd/volumes" Oct 04 04:51:20 crc kubenswrapper[4574]: I1004 04:51:20.863522 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-77c8c5f65c-v2548"] Oct 04 04:51:21 crc kubenswrapper[4574]: I1004 04:51:21.505695 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" event={"ID":"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab","Type":"ContainerStarted","Data":"3eb0f569ab8c7f128d2d20bb5fd25957d36b5482b64f855d56b34df037d3695f"} Oct 04 04:51:21 crc kubenswrapper[4574]: I1004 04:51:21.506111 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" event={"ID":"a5aabd6d-e899-4d55-9fd2-9f8c01a8eeab","Type":"ContainerStarted","Data":"8d3254769f399af9b36ce22a522e47ad7eededb78f128d85b4235a2e85e1b2ae"} Oct 04 04:51:21 crc kubenswrapper[4574]: I1004 04:51:21.506133 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:21 crc kubenswrapper[4574]: I1004 04:51:21.534220 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" podStartSLOduration=27.534197331 podStartE2EDuration="27.534197331s" podCreationTimestamp="2025-10-04 04:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:51:21.533501299 +0000 UTC m=+307.387644361" watchObservedRunningTime="2025-10-04 04:51:21.534197331 +0000 UTC m=+307.388340373" Oct 04 04:51:21 crc 
kubenswrapper[4574]: I1004 04:51:21.666026 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-77c8c5f65c-v2548" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.666572 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvj48"] Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.667733 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mvj48" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="registry-server" containerID="cri-o://0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f" gracePeriod=30 Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.670617 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frmzn"] Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.670917 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-frmzn" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="registry-server" containerID="cri-o://4e080c27a2d8f40598f51f55483c2f236a547d01cfea28660a8e8d7837095284" gracePeriod=30 Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.682298 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-44hzk"] Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.682557 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" containerID="cri-o://c37f28ff0ee9f99ff699444636f31d08ee1e6134fb1a44d86f3eed97114865bc" gracePeriod=30 Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.697393 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qtgfg"] Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.697685 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qtgfg" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="registry-server" containerID="cri-o://ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" gracePeriod=30 Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.713441 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8sgv5"] Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.713795 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8sgv5" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="registry-server" containerID="cri-o://bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7" gracePeriod=30 Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.729458 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-28wcm"] Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.730326 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.761904 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-28wcm"] Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.886003 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sptl\" (UniqueName: \"kubernetes.io/projected/40f47671-d6bd-402e-8003-3688245aa0ed-kube-api-access-5sptl\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.886119 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40f47671-d6bd-402e-8003-3688245aa0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.886179 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40f47671-d6bd-402e-8003-3688245aa0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:37 crc kubenswrapper[4574]: E1004 04:51:37.937757 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4 is running failed: container process not found" 
containerID="ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" cmd=["grpc_health_probe","-addr=:50051"] Oct 04 04:51:37 crc kubenswrapper[4574]: E1004 04:51:37.938516 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4 is running failed: container process not found" containerID="ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" cmd=["grpc_health_probe","-addr=:50051"] Oct 04 04:51:37 crc kubenswrapper[4574]: E1004 04:51:37.938908 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4 is running failed: container process not found" containerID="ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" cmd=["grpc_health_probe","-addr=:50051"] Oct 04 04:51:37 crc kubenswrapper[4574]: E1004 04:51:37.938937 4574 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-qtgfg" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="registry-server" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.987358 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sptl\" (UniqueName: \"kubernetes.io/projected/40f47671-d6bd-402e-8003-3688245aa0ed-kube-api-access-5sptl\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.987524 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40f47671-d6bd-402e-8003-3688245aa0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.987605 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40f47671-d6bd-402e-8003-3688245aa0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:37 crc kubenswrapper[4574]: I1004 04:51:37.989560 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40f47671-d6bd-402e-8003-3688245aa0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.000569 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4de8b5_1b24_4ccd_bb4c_5b6bba86e4d6.slice/crio-ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea7d0ff_1f3f_469a_8600_7393bb8ec4c7.slice/crio-conmon-0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a786e2_3629_456c_a861_3e5abcd343a2.slice/crio-bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29aee87b_0598_4b50_9b1a_beacaf6d7275.slice/crio-conmon-c37f28ff0ee9f99ff699444636f31d08ee1e6134fb1a44d86f3eed97114865bc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a786e2_3629_456c_a861_3e5abcd343a2.slice/crio-conmon-bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7.scope\": RecentStats: unable to find data in memory cache]" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.004057 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40f47671-d6bd-402e-8003-3688245aa0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.010600 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sptl\" (UniqueName: \"kubernetes.io/projected/40f47671-d6bd-402e-8003-3688245aa0ed-kube-api-access-5sptl\") pod \"marketplace-operator-79b997595-28wcm\" (UID: \"40f47671-d6bd-402e-8003-3688245aa0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.051263 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.206785 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.291789 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-utilities\") pod \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.291892 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvdzq\" (UniqueName: \"kubernetes.io/projected/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-kube-api-access-bvdzq\") pod \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.291943 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-catalog-content\") pod \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\" (UID: \"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.294089 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-utilities" (OuterVolumeSpecName: "utilities") pod "4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" (UID: "4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.295278 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.311675 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-kube-api-access-bvdzq" (OuterVolumeSpecName: "kube-api-access-bvdzq") pod "4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" (UID: "4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7"). InnerVolumeSpecName "kube-api-access-bvdzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.348735 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.366993 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.397985 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvdzq\" (UniqueName: \"kubernetes.io/projected/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-kube-api-access-bvdzq\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.403563 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" (UID: "4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.446243 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-28wcm"] Oct 04 04:51:38 crc kubenswrapper[4574]: W1004 04:51:38.452385 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f47671_d6bd_402e_8003_3688245aa0ed.slice/crio-3019eefaff6f5e20969242762cdf71a1f93e1da0bd0e1fb328eaab4052ff8f9b WatchSource:0}: Error finding container 3019eefaff6f5e20969242762cdf71a1f93e1da0bd0e1fb328eaab4052ff8f9b: Status 404 returned error can't find the container with id 3019eefaff6f5e20969242762cdf71a1f93e1da0bd0e1fb328eaab4052ff8f9b Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499003 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-catalog-content\") pod \"b7a786e2-3629-456c-a861-3e5abcd343a2\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499118 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-utilities\") pod \"b7a786e2-3629-456c-a861-3e5abcd343a2\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499144 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m9n9\" (UniqueName: \"kubernetes.io/projected/b7a786e2-3629-456c-a861-3e5abcd343a2-kube-api-access-2m9n9\") pod \"b7a786e2-3629-456c-a861-3e5abcd343a2\" (UID: \"b7a786e2-3629-456c-a861-3e5abcd343a2\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499189 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-catalog-content\") pod \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499342 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bttk\" (UniqueName: \"kubernetes.io/projected/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-kube-api-access-6bttk\") pod \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499388 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-utilities\") pod \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\" (UID: \"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6\") " Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499660 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.499919 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-utilities" (OuterVolumeSpecName: "utilities") pod "b7a786e2-3629-456c-a861-3e5abcd343a2" (UID: "b7a786e2-3629-456c-a861-3e5abcd343a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.500790 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-utilities" (OuterVolumeSpecName: "utilities") pod "5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" (UID: "5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.505144 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-kube-api-access-6bttk" (OuterVolumeSpecName: "kube-api-access-6bttk") pod "5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" (UID: "5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6"). InnerVolumeSpecName "kube-api-access-6bttk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.506243 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a786e2-3629-456c-a861-3e5abcd343a2-kube-api-access-2m9n9" (OuterVolumeSpecName: "kube-api-access-2m9n9") pod "b7a786e2-3629-456c-a861-3e5abcd343a2" (UID: "b7a786e2-3629-456c-a861-3e5abcd343a2"). InnerVolumeSpecName "kube-api-access-2m9n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.525693 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" (UID: "5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.602263 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bttk\" (UniqueName: \"kubernetes.io/projected/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-kube-api-access-6bttk\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.606093 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7a786e2-3629-456c-a861-3e5abcd343a2" (UID: "b7a786e2-3629-456c-a861-3e5abcd343a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.611467 4574 generic.go:334] "Generic (PLEG): container finished" podID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerID="bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7" exitCode=0 Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.611504 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sgv5" event={"ID":"b7a786e2-3629-456c-a861-3e5abcd343a2","Type":"ContainerDied","Data":"bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.629462 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sgv5" event={"ID":"b7a786e2-3629-456c-a861-3e5abcd343a2","Type":"ContainerDied","Data":"82336a25cb66e539c3b2750f4d03245599e8be5a9f7ea895fc31dbab4081ea81"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.629501 4574 scope.go:117] "RemoveContainer" containerID="bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.611589 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sgv5" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.635272 4574 generic.go:334] "Generic (PLEG): container finished" podID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerID="ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" exitCode=0 Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.635357 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qtgfg" event={"ID":"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6","Type":"ContainerDied","Data":"ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.635394 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qtgfg" event={"ID":"5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6","Type":"ContainerDied","Data":"9714482480fcdf3560fbc0b7d35dc77e75a7604a822174bfdace1b7c711e8d4a"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.635473 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qtgfg" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.646306 4574 generic.go:334] "Generic (PLEG): container finished" podID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerID="0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f" exitCode=0 Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.646394 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvj48" event={"ID":"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7","Type":"ContainerDied","Data":"0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.646439 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvj48" event={"ID":"4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7","Type":"ContainerDied","Data":"a8b270f981663ace03ddc252409f72167850fd89d6a4a1b6459fa4a630d2a31c"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.646543 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvj48" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.656454 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.656650 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.656741 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m9n9\" (UniqueName: \"kubernetes.io/projected/b7a786e2-3629-456c-a861-3e5abcd343a2-kube-api-access-2m9n9\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.656829 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.660058 4574 generic.go:334] "Generic (PLEG): container finished" podID="3548b80e-8db9-4112-a727-6deaf3242864" containerID="4e080c27a2d8f40598f51f55483c2f236a547d01cfea28660a8e8d7837095284" exitCode=0 Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.660227 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frmzn" event={"ID":"3548b80e-8db9-4112-a727-6deaf3242864","Type":"ContainerDied","Data":"4e080c27a2d8f40598f51f55483c2f236a547d01cfea28660a8e8d7837095284"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.661314 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" 
event={"ID":"40f47671-d6bd-402e-8003-3688245aa0ed","Type":"ContainerStarted","Data":"3019eefaff6f5e20969242762cdf71a1f93e1da0bd0e1fb328eaab4052ff8f9b"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.661557 4574 scope.go:117] "RemoveContainer" containerID="aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.674130 4574 generic.go:334] "Generic (PLEG): container finished" podID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerID="c37f28ff0ee9f99ff699444636f31d08ee1e6134fb1a44d86f3eed97114865bc" exitCode=0 Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.674625 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" event={"ID":"29aee87b-0598-4b50-9b1a-beacaf6d7275","Type":"ContainerDied","Data":"c37f28ff0ee9f99ff699444636f31d08ee1e6134fb1a44d86f3eed97114865bc"} Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.701729 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qtgfg"] Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.705960 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qtgfg"] Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.716556 4574 scope.go:117] "RemoveContainer" containerID="3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.721350 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8sgv5"] Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.724969 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8sgv5"] Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.755693 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" 
path="/var/lib/kubelet/pods/5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6/volumes" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.757140 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" path="/var/lib/kubelet/pods/b7a786e2-3629-456c-a861-3e5abcd343a2/volumes" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.759063 4574 scope.go:117] "RemoveContainer" containerID="bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.759266 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvj48"] Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.759776 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a786e2-3629-456c-a861-3e5abcd343a2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.762746 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mvj48"] Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.763767 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7\": container with ID starting with bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7 not found: ID does not exist" containerID="bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.763812 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7"} err="failed to get container status \"bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7\": rpc error: code = NotFound desc = could not find container 
\"bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7\": container with ID starting with bbd1b4bc011d797512273a69a8391de7a925c765f5551e7ebf2f7f93415c13e7 not found: ID does not exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.763846 4574 scope.go:117] "RemoveContainer" containerID="aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.767006 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190\": container with ID starting with aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190 not found: ID does not exist" containerID="aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.767078 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190"} err="failed to get container status \"aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190\": rpc error: code = NotFound desc = could not find container \"aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190\": container with ID starting with aaafee1a5189df88061c97c4ad3df453ab2cd457cedd44953dd2b7cf3e0f9190 not found: ID does not exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.767127 4574 scope.go:117] "RemoveContainer" containerID="3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.772531 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2\": container with ID starting with 3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2 not found: ID does not exist" 
containerID="3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.772578 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2"} err="failed to get container status \"3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2\": rpc error: code = NotFound desc = could not find container \"3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2\": container with ID starting with 3b898a0a83ad0e1ffd7a585e32c435d89430a8e9c6ed7d21d8d50701335199f2 not found: ID does not exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.772610 4574 scope.go:117] "RemoveContainer" containerID="ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.797429 4574 scope.go:117] "RemoveContainer" containerID="b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.835476 4574 scope.go:117] "RemoveContainer" containerID="11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.859814 4574 scope.go:117] "RemoveContainer" containerID="ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.865507 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4\": container with ID starting with ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4 not found: ID does not exist" containerID="ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.865569 4574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4"} err="failed to get container status \"ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4\": rpc error: code = NotFound desc = could not find container \"ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4\": container with ID starting with ce5a3de655216b946c34b1914a569a16dfc0e2b8dab304ced63bacaf5f08d4d4 not found: ID does not exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.865603 4574 scope.go:117] "RemoveContainer" containerID="b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.867649 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3\": container with ID starting with b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3 not found: ID does not exist" containerID="b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.867695 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3"} err="failed to get container status \"b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3\": rpc error: code = NotFound desc = could not find container \"b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3\": container with ID starting with b7e64539790d60969562ecf273c12b5a0c2f7e181bfd423b319bcbb6b0d132e3 not found: ID does not exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.867731 4574 scope.go:117] "RemoveContainer" containerID="11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.869975 4574 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d\": container with ID starting with 11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d not found: ID does not exist" containerID="11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.870014 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d"} err="failed to get container status \"11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d\": rpc error: code = NotFound desc = could not find container \"11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d\": container with ID starting with 11bd6996289269bb122dd5147b13fc2cd955c0910dbf7f0739dcd00632c98e4d not found: ID does not exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.870041 4574 scope.go:117] "RemoveContainer" containerID="0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.908324 4574 scope.go:117] "RemoveContainer" containerID="eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.947579 4574 scope.go:117] "RemoveContainer" containerID="490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.978303 4574 scope.go:117] "RemoveContainer" containerID="0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.985384 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f\": container with ID starting with 
0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f not found: ID does not exist" containerID="0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.985441 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f"} err="failed to get container status \"0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f\": rpc error: code = NotFound desc = could not find container \"0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f\": container with ID starting with 0ac40aa9670a40e4b5ece3a5c256e866e1ed9143e9bd1674221204b7ed08223f not found: ID does not exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.985486 4574 scope.go:117] "RemoveContainer" containerID="eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.986491 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901\": container with ID starting with eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901 not found: ID does not exist" containerID="eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.986517 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901"} err="failed to get container status \"eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901\": rpc error: code = NotFound desc = could not find container \"eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901\": container with ID starting with eee8c1ecc97cfe2c1865d0169b7e6d674677f16a56653b86acb5daa29192c901 not found: ID does not 
exist" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.986532 4574 scope.go:117] "RemoveContainer" containerID="490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2" Oct 04 04:51:38 crc kubenswrapper[4574]: E1004 04:51:38.986822 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2\": container with ID starting with 490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2 not found: ID does not exist" containerID="490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2" Oct 04 04:51:38 crc kubenswrapper[4574]: I1004 04:51:38.986849 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2"} err="failed to get container status \"490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2\": rpc error: code = NotFound desc = could not find container \"490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2\": container with ID starting with 490e9d786d0369bae6febdf5be84db6ee9613bd28f7eda6aced469ab48a247b2 not found: ID does not exist" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.020742 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.170217 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-trusted-ca\") pod \"29aee87b-0598-4b50-9b1a-beacaf6d7275\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.170342 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2xk\" (UniqueName: \"kubernetes.io/projected/29aee87b-0598-4b50-9b1a-beacaf6d7275-kube-api-access-vf2xk\") pod \"29aee87b-0598-4b50-9b1a-beacaf6d7275\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.170397 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-operator-metrics\") pod \"29aee87b-0598-4b50-9b1a-beacaf6d7275\" (UID: \"29aee87b-0598-4b50-9b1a-beacaf6d7275\") " Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.174543 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "29aee87b-0598-4b50-9b1a-beacaf6d7275" (UID: "29aee87b-0598-4b50-9b1a-beacaf6d7275"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.179742 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "29aee87b-0598-4b50-9b1a-beacaf6d7275" (UID: "29aee87b-0598-4b50-9b1a-beacaf6d7275"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.180590 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29aee87b-0598-4b50-9b1a-beacaf6d7275-kube-api-access-vf2xk" (OuterVolumeSpecName: "kube-api-access-vf2xk") pod "29aee87b-0598-4b50-9b1a-beacaf6d7275" (UID: "29aee87b-0598-4b50-9b1a-beacaf6d7275"). InnerVolumeSpecName "kube-api-access-vf2xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.233055 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.273649 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf2xk\" (UniqueName: \"kubernetes.io/projected/29aee87b-0598-4b50-9b1a-beacaf6d7275-kube-api-access-vf2xk\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.273681 4574 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.273695 4574 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29aee87b-0598-4b50-9b1a-beacaf6d7275-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.374773 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-catalog-content\") pod \"3548b80e-8db9-4112-a727-6deaf3242864\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.374875 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-utilities\") pod \"3548b80e-8db9-4112-a727-6deaf3242864\" (UID: \"3548b80e-8db9-4112-a727-6deaf3242864\") " Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.374912 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfjpg\" (UniqueName: \"kubernetes.io/projected/3548b80e-8db9-4112-a727-6deaf3242864-kube-api-access-nfjpg\") pod \"3548b80e-8db9-4112-a727-6deaf3242864\" (UID: 
\"3548b80e-8db9-4112-a727-6deaf3242864\") " Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.376423 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-utilities" (OuterVolumeSpecName: "utilities") pod "3548b80e-8db9-4112-a727-6deaf3242864" (UID: "3548b80e-8db9-4112-a727-6deaf3242864"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.380282 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3548b80e-8db9-4112-a727-6deaf3242864-kube-api-access-nfjpg" (OuterVolumeSpecName: "kube-api-access-nfjpg") pod "3548b80e-8db9-4112-a727-6deaf3242864" (UID: "3548b80e-8db9-4112-a727-6deaf3242864"). InnerVolumeSpecName "kube-api-access-nfjpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.432791 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3548b80e-8db9-4112-a727-6deaf3242864" (UID: "3548b80e-8db9-4112-a727-6deaf3242864"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.476115 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.476155 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfjpg\" (UniqueName: \"kubernetes.io/projected/3548b80e-8db9-4112-a727-6deaf3242864-kube-api-access-nfjpg\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.476168 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3548b80e-8db9-4112-a727-6deaf3242864-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.681290 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" event={"ID":"29aee87b-0598-4b50-9b1a-beacaf6d7275","Type":"ContainerDied","Data":"1754df0411b3661e9c15ccb36f7961821200df49e68b4cfd1308aba3b8aca11a"} Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.681325 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-44hzk" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.681357 4574 scope.go:117] "RemoveContainer" containerID="c37f28ff0ee9f99ff699444636f31d08ee1e6134fb1a44d86f3eed97114865bc" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.686466 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frmzn" event={"ID":"3548b80e-8db9-4112-a727-6deaf3242864","Type":"ContainerDied","Data":"555c491eb17f97474a761ca8787f2e9ec43cbc16513cf3cd8f9d6a6823552f22"} Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.686499 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frmzn" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.694519 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" event={"ID":"40f47671-d6bd-402e-8003-3688245aa0ed","Type":"ContainerStarted","Data":"3b18a880d9d4b9e53afc12ef5b4cb8a5c934ec7d3845e5c474fbcf6ac896709c"} Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.697180 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.706987 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.712378 4574 scope.go:117] "RemoveContainer" containerID="4e080c27a2d8f40598f51f55483c2f236a547d01cfea28660a8e8d7837095284" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.714541 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-28wcm" podStartSLOduration=2.714518753 podStartE2EDuration="2.714518753s" 
podCreationTimestamp="2025-10-04 04:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:51:39.712758417 +0000 UTC m=+325.566901459" watchObservedRunningTime="2025-10-04 04:51:39.714518753 +0000 UTC m=+325.568661785" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.744437 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frmzn"] Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.747674 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-frmzn"] Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.753665 4574 scope.go:117] "RemoveContainer" containerID="9d15215c978488ec724a284316b09efd22e5d07cfd257f7f0263af8c76647ddc" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.789540 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-44hzk"] Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.796582 4574 scope.go:117] "RemoveContainer" containerID="8cda84a031fa0f9c3218d7d7a40ab5d3bb712630b53e1f3a98891bc91dcedfa5" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.801091 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-44hzk"] Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.888902 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9mdwq"] Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889124 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889139 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: 
E1004 04:51:39.889155 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="extract-utilities" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889161 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="extract-utilities" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889171 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="extract-utilities" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889178 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="extract-utilities" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889189 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889194 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889205 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="extract-utilities" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889211 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="extract-utilities" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889219 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="extract-utilities" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889226 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="extract-utilities" Oct 04 04:51:39 crc 
kubenswrapper[4574]: E1004 04:51:39.889255 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889267 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889279 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889286 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889296 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889303 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889310 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889317 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889327 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889333 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="registry-server" Oct 04 04:51:39 crc 
kubenswrapper[4574]: E1004 04:51:39.889342 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889347 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: E1004 04:51:39.889357 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889365 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="extract-content" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889454 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a786e2-3629-456c-a861-3e5abcd343a2" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889471 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" containerName="marketplace-operator" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889477 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="3548b80e-8db9-4112-a727-6deaf3242864" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889485 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.889494 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4de8b5-1b24-4ccd-bb4c-5b6bba86e4d6" containerName="registry-server" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.890271 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.894219 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.911783 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mdwq"] Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.988417 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvv2b\" (UniqueName: \"kubernetes.io/projected/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-kube-api-access-vvv2b\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.988461 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-utilities\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:39 crc kubenswrapper[4574]: I1004 04:51:39.988693 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-catalog-content\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.084628 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qqtj9"] Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.085974 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.088955 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.089767 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-catalog-content\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.089921 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvv2b\" (UniqueName: \"kubernetes.io/projected/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-kube-api-access-vvv2b\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.089962 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-utilities\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.090317 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-catalog-content\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.090552 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-utilities\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.111185 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqtj9"] Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.129091 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvv2b\" (UniqueName: \"kubernetes.io/projected/0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a-kube-api-access-vvv2b\") pod \"redhat-marketplace-9mdwq\" (UID: \"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a\") " pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.191486 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e881d007-aeba-48d9-8470-62ff6311df35-utilities\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.191566 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqlr2\" (UniqueName: \"kubernetes.io/projected/e881d007-aeba-48d9-8470-62ff6311df35-kube-api-access-rqlr2\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.191596 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e881d007-aeba-48d9-8470-62ff6311df35-catalog-content\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " 
pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.211668 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.293502 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e881d007-aeba-48d9-8470-62ff6311df35-utilities\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.293919 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqlr2\" (UniqueName: \"kubernetes.io/projected/e881d007-aeba-48d9-8470-62ff6311df35-kube-api-access-rqlr2\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.293942 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e881d007-aeba-48d9-8470-62ff6311df35-catalog-content\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.295750 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e881d007-aeba-48d9-8470-62ff6311df35-catalog-content\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.296052 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e881d007-aeba-48d9-8470-62ff6311df35-utilities\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.322316 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqlr2\" (UniqueName: \"kubernetes.io/projected/e881d007-aeba-48d9-8470-62ff6311df35-kube-api-access-rqlr2\") pod \"redhat-operators-qqtj9\" (UID: \"e881d007-aeba-48d9-8470-62ff6311df35\") " pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.403246 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.437093 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mdwq"] Oct 04 04:51:40 crc kubenswrapper[4574]: W1004 04:51:40.441780 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8d9eda_f6e8_4f07_9a5c_4162010bfb9a.slice/crio-db0d00d5f9679f6ddc147835f85763795fcd72337cb027cfa5ea1c792ec1a123 WatchSource:0}: Error finding container db0d00d5f9679f6ddc147835f85763795fcd72337cb027cfa5ea1c792ec1a123: Status 404 returned error can't find the container with id db0d00d5f9679f6ddc147835f85763795fcd72337cb027cfa5ea1c792ec1a123 Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.646999 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqtj9"] Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.701138 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqtj9" event={"ID":"e881d007-aeba-48d9-8470-62ff6311df35","Type":"ContainerStarted","Data":"8d5286167798dfcde0c260a1fbb2cf1fed23f517695c79833d2df463aaab31a7"} Oct 
04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.704523 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mdwq" event={"ID":"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a","Type":"ContainerStarted","Data":"db0d00d5f9679f6ddc147835f85763795fcd72337cb027cfa5ea1c792ec1a123"} Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.743385 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29aee87b-0598-4b50-9b1a-beacaf6d7275" path="/var/lib/kubelet/pods/29aee87b-0598-4b50-9b1a-beacaf6d7275/volumes" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.744127 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3548b80e-8db9-4112-a727-6deaf3242864" path="/var/lib/kubelet/pods/3548b80e-8db9-4112-a727-6deaf3242864/volumes" Oct 04 04:51:40 crc kubenswrapper[4574]: I1004 04:51:40.744995 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7" path="/var/lib/kubelet/pods/4ea7d0ff-1f3f-469a-8600-7393bb8ec4c7/volumes" Oct 04 04:51:41 crc kubenswrapper[4574]: I1004 04:51:41.711752 4574 generic.go:334] "Generic (PLEG): container finished" podID="0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a" containerID="e5b24aa7ce45d122f5736403795f3c3c1f54f8e358af334306225f002270532d" exitCode=0 Oct 04 04:51:41 crc kubenswrapper[4574]: I1004 04:51:41.711879 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mdwq" event={"ID":"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a","Type":"ContainerDied","Data":"e5b24aa7ce45d122f5736403795f3c3c1f54f8e358af334306225f002270532d"} Oct 04 04:51:41 crc kubenswrapper[4574]: I1004 04:51:41.714910 4574 generic.go:334] "Generic (PLEG): container finished" podID="e881d007-aeba-48d9-8470-62ff6311df35" containerID="d2d8b4cabef9ba290c4c24c08bc69a8a8c70f57ce961e3fb98fac88e61990b9a" exitCode=0 Oct 04 04:51:41 crc kubenswrapper[4574]: I1004 04:51:41.715421 4574 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqtj9" event={"ID":"e881d007-aeba-48d9-8470-62ff6311df35","Type":"ContainerDied","Data":"d2d8b4cabef9ba290c4c24c08bc69a8a8c70f57ce961e3fb98fac88e61990b9a"} Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.292691 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tfrp6"] Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.296429 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfrp6"] Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.296532 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.300904 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.423846 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjh78\" (UniqueName: \"kubernetes.io/projected/ba61d575-a013-4481-b936-66c5f531f238-kube-api-access-fjh78\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.423952 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba61d575-a013-4481-b936-66c5f531f238-utilities\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.423996 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ba61d575-a013-4481-b936-66c5f531f238-catalog-content\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.492253 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bj2qb"] Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.493522 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.495779 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.501137 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj2qb"] Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.525846 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjh78\" (UniqueName: \"kubernetes.io/projected/ba61d575-a013-4481-b936-66c5f531f238-kube-api-access-fjh78\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.525935 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba61d575-a013-4481-b936-66c5f531f238-utilities\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.525993 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ba61d575-a013-4481-b936-66c5f531f238-catalog-content\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.526595 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba61d575-a013-4481-b936-66c5f531f238-utilities\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.526649 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba61d575-a013-4481-b936-66c5f531f238-catalog-content\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.549371 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjh78\" (UniqueName: \"kubernetes.io/projected/ba61d575-a013-4481-b936-66c5f531f238-kube-api-access-fjh78\") pod \"certified-operators-tfrp6\" (UID: \"ba61d575-a013-4481-b936-66c5f531f238\") " pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.615637 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.627458 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vchqc\" (UniqueName: \"kubernetes.io/projected/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-kube-api-access-vchqc\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.627522 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-utilities\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.627609 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-catalog-content\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.729225 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vchqc\" (UniqueName: \"kubernetes.io/projected/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-kube-api-access-vchqc\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.729738 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-utilities\") pod 
\"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.729809 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-catalog-content\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.730631 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-catalog-content\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.731012 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-utilities\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.751920 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vchqc\" (UniqueName: \"kubernetes.io/projected/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-kube-api-access-vchqc\") pod \"community-operators-bj2qb\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.818336 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:42 crc kubenswrapper[4574]: I1004 04:51:42.852674 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfrp6"] Oct 04 04:51:42 crc kubenswrapper[4574]: W1004 04:51:42.860276 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba61d575_a013_4481_b936_66c5f531f238.slice/crio-b2743e9b7182a92879ffca7f39605d3760c18615e739060d8c84b9fe604d1aba WatchSource:0}: Error finding container b2743e9b7182a92879ffca7f39605d3760c18615e739060d8c84b9fe604d1aba: Status 404 returned error can't find the container with id b2743e9b7182a92879ffca7f39605d3760c18615e739060d8c84b9fe604d1aba Oct 04 04:51:43 crc kubenswrapper[4574]: I1004 04:51:43.051136 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj2qb"] Oct 04 04:51:43 crc kubenswrapper[4574]: W1004 04:51:43.055773 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75531a00_f8c5_4f9d_b7e6_b576ab9bd903.slice/crio-321de385d0fc95c3ea543dc0db562ec86dfe2cac90344725fe27feb8fc665177 WatchSource:0}: Error finding container 321de385d0fc95c3ea543dc0db562ec86dfe2cac90344725fe27feb8fc665177: Status 404 returned error can't find the container with id 321de385d0fc95c3ea543dc0db562ec86dfe2cac90344725fe27feb8fc665177 Oct 04 04:51:43 crc kubenswrapper[4574]: I1004 04:51:43.728732 4574 generic.go:334] "Generic (PLEG): container finished" podID="ba61d575-a013-4481-b936-66c5f531f238" containerID="27fa19877ee705331517d0a721d91c37a08599b56d98dd5b6cd746a013929f30" exitCode=0 Oct 04 04:51:43 crc kubenswrapper[4574]: I1004 04:51:43.728854 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfrp6" 
event={"ID":"ba61d575-a013-4481-b936-66c5f531f238","Type":"ContainerDied","Data":"27fa19877ee705331517d0a721d91c37a08599b56d98dd5b6cd746a013929f30"} Oct 04 04:51:43 crc kubenswrapper[4574]: I1004 04:51:43.729252 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfrp6" event={"ID":"ba61d575-a013-4481-b936-66c5f531f238","Type":"ContainerStarted","Data":"b2743e9b7182a92879ffca7f39605d3760c18615e739060d8c84b9fe604d1aba"} Oct 04 04:51:43 crc kubenswrapper[4574]: I1004 04:51:43.731252 4574 generic.go:334] "Generic (PLEG): container finished" podID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerID="bbcc2d3cd9f91e0d628d22c1f152a3de9c22b8fe3cc9d5b201855c7f7954f0ed" exitCode=0 Oct 04 04:51:43 crc kubenswrapper[4574]: I1004 04:51:43.731318 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2qb" event={"ID":"75531a00-f8c5-4f9d-b7e6-b576ab9bd903","Type":"ContainerDied","Data":"bbcc2d3cd9f91e0d628d22c1f152a3de9c22b8fe3cc9d5b201855c7f7954f0ed"} Oct 04 04:51:43 crc kubenswrapper[4574]: I1004 04:51:43.731394 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2qb" event={"ID":"75531a00-f8c5-4f9d-b7e6-b576ab9bd903","Type":"ContainerStarted","Data":"321de385d0fc95c3ea543dc0db562ec86dfe2cac90344725fe27feb8fc665177"} Oct 04 04:51:44 crc kubenswrapper[4574]: I1004 04:51:44.743942 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfrp6" event={"ID":"ba61d575-a013-4481-b936-66c5f531f238","Type":"ContainerStarted","Data":"037de2b5f396592a4205e04414e962886202b3687a306aefe54cadef4aa345a3"} Oct 04 04:51:44 crc kubenswrapper[4574]: I1004 04:51:44.751750 4574 generic.go:334] "Generic (PLEG): container finished" podID="0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a" containerID="47e2d50ab632a55dd141ea0e966239f1078e3fedae8390fafcb6bf489c7f4503" exitCode=0 Oct 04 04:51:44 crc kubenswrapper[4574]: 
I1004 04:51:44.751857 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mdwq" event={"ID":"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a","Type":"ContainerDied","Data":"47e2d50ab632a55dd141ea0e966239f1078e3fedae8390fafcb6bf489c7f4503"} Oct 04 04:51:44 crc kubenswrapper[4574]: I1004 04:51:44.758327 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqtj9" event={"ID":"e881d007-aeba-48d9-8470-62ff6311df35","Type":"ContainerStarted","Data":"65aa7aebe682e26b5ed42cc6bfbcd2404cc6e0e2e992ab9b80e4d29606310e4b"} Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.767190 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mdwq" event={"ID":"0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a","Type":"ContainerStarted","Data":"41542cb4b55b5246fd6148bed23e078443f2f56c9b418265fd07ba203fa9efaf"} Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.774691 4574 generic.go:334] "Generic (PLEG): container finished" podID="ba61d575-a013-4481-b936-66c5f531f238" containerID="037de2b5f396592a4205e04414e962886202b3687a306aefe54cadef4aa345a3" exitCode=0 Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.774762 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfrp6" event={"ID":"ba61d575-a013-4481-b936-66c5f531f238","Type":"ContainerDied","Data":"037de2b5f396592a4205e04414e962886202b3687a306aefe54cadef4aa345a3"} Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.782544 4574 generic.go:334] "Generic (PLEG): container finished" podID="e881d007-aeba-48d9-8470-62ff6311df35" containerID="65aa7aebe682e26b5ed42cc6bfbcd2404cc6e0e2e992ab9b80e4d29606310e4b" exitCode=0 Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.782753 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqtj9" 
event={"ID":"e881d007-aeba-48d9-8470-62ff6311df35","Type":"ContainerDied","Data":"65aa7aebe682e26b5ed42cc6bfbcd2404cc6e0e2e992ab9b80e4d29606310e4b"} Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.787156 4574 generic.go:334] "Generic (PLEG): container finished" podID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerID="676104c9487a7432b1109bacc391a51eda5a146ada91f892dcdb976530c121d3" exitCode=0 Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.787210 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2qb" event={"ID":"75531a00-f8c5-4f9d-b7e6-b576ab9bd903","Type":"ContainerDied","Data":"676104c9487a7432b1109bacc391a51eda5a146ada91f892dcdb976530c121d3"} Oct 04 04:51:45 crc kubenswrapper[4574]: I1004 04:51:45.808158 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9mdwq" podStartSLOduration=3.26214028 podStartE2EDuration="6.808135996s" podCreationTimestamp="2025-10-04 04:51:39 +0000 UTC" firstStartedPulling="2025-10-04 04:51:41.713820322 +0000 UTC m=+327.567963364" lastFinishedPulling="2025-10-04 04:51:45.259816038 +0000 UTC m=+331.113959080" observedRunningTime="2025-10-04 04:51:45.80042485 +0000 UTC m=+331.654567892" watchObservedRunningTime="2025-10-04 04:51:45.808135996 +0000 UTC m=+331.662279038" Oct 04 04:51:46 crc kubenswrapper[4574]: I1004 04:51:46.798460 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfrp6" event={"ID":"ba61d575-a013-4481-b936-66c5f531f238","Type":"ContainerStarted","Data":"2453c883fb6b855de1f4beb0badb93e2ea225b36d2ed68e7ddf64a40ce40fb45"} Oct 04 04:51:46 crc kubenswrapper[4574]: I1004 04:51:46.801622 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqtj9" event={"ID":"e881d007-aeba-48d9-8470-62ff6311df35","Type":"ContainerStarted","Data":"5a8840d3b53087c8dc32bf11145b956bb3646f321c51004047cfb32c7b3347e3"} 
Oct 04 04:51:46 crc kubenswrapper[4574]: I1004 04:51:46.805389 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2qb" event={"ID":"75531a00-f8c5-4f9d-b7e6-b576ab9bd903","Type":"ContainerStarted","Data":"8299160357dd53ee1cf671632ee0adf17c100fa63f01799f6b532b93d2648fb4"} Oct 04 04:51:46 crc kubenswrapper[4574]: I1004 04:51:46.818024 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tfrp6" podStartSLOduration=1.989604467 podStartE2EDuration="4.817999395s" podCreationTimestamp="2025-10-04 04:51:42 +0000 UTC" firstStartedPulling="2025-10-04 04:51:43.76661745 +0000 UTC m=+329.620760492" lastFinishedPulling="2025-10-04 04:51:46.595012378 +0000 UTC m=+332.449155420" observedRunningTime="2025-10-04 04:51:46.817961744 +0000 UTC m=+332.672104786" watchObservedRunningTime="2025-10-04 04:51:46.817999395 +0000 UTC m=+332.672142437" Oct 04 04:51:46 crc kubenswrapper[4574]: I1004 04:51:46.847665 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qqtj9" podStartSLOduration=2.373560871 podStartE2EDuration="6.847638015s" podCreationTimestamp="2025-10-04 04:51:40 +0000 UTC" firstStartedPulling="2025-10-04 04:51:41.718141509 +0000 UTC m=+327.572284551" lastFinishedPulling="2025-10-04 04:51:46.192218653 +0000 UTC m=+332.046361695" observedRunningTime="2025-10-04 04:51:46.844152093 +0000 UTC m=+332.698295155" watchObservedRunningTime="2025-10-04 04:51:46.847638015 +0000 UTC m=+332.701781057" Oct 04 04:51:46 crc kubenswrapper[4574]: I1004 04:51:46.879758 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bj2qb" podStartSLOduration=2.404504218 podStartE2EDuration="4.879730197s" podCreationTimestamp="2025-10-04 04:51:42 +0000 UTC" firstStartedPulling="2025-10-04 04:51:43.76659696 +0000 UTC m=+329.620740002" 
lastFinishedPulling="2025-10-04 04:51:46.241822949 +0000 UTC m=+332.095965981" observedRunningTime="2025-10-04 04:51:46.874492064 +0000 UTC m=+332.728635126" watchObservedRunningTime="2025-10-04 04:51:46.879730197 +0000 UTC m=+332.733873239" Oct 04 04:51:50 crc kubenswrapper[4574]: I1004 04:51:50.216480 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:50 crc kubenswrapper[4574]: I1004 04:51:50.217061 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:50 crc kubenswrapper[4574]: I1004 04:51:50.275700 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:50 crc kubenswrapper[4574]: I1004 04:51:50.404193 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:50 crc kubenswrapper[4574]: I1004 04:51:50.404372 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:51:50 crc kubenswrapper[4574]: I1004 04:51:50.867576 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9mdwq" Oct 04 04:51:51 crc kubenswrapper[4574]: I1004 04:51:51.445746 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qqtj9" podUID="e881d007-aeba-48d9-8470-62ff6311df35" containerName="registry-server" probeResult="failure" output=< Oct 04 04:51:51 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 04:51:51 crc kubenswrapper[4574]: > Oct 04 04:51:52 crc kubenswrapper[4574]: I1004 04:51:52.615956 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:52 
crc kubenswrapper[4574]: I1004 04:51:52.616038 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:52 crc kubenswrapper[4574]: I1004 04:51:52.662572 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:52 crc kubenswrapper[4574]: I1004 04:51:52.818855 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:52 crc kubenswrapper[4574]: I1004 04:51:52.818917 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:52 crc kubenswrapper[4574]: I1004 04:51:52.877057 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:51:52 crc kubenswrapper[4574]: I1004 04:51:52.904152 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tfrp6" Oct 04 04:51:52 crc kubenswrapper[4574]: I1004 04:51:52.954600 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 04:52:00 crc kubenswrapper[4574]: I1004 04:52:00.451852 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:52:00 crc kubenswrapper[4574]: I1004 04:52:00.497060 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qqtj9" Oct 04 04:52:19 crc kubenswrapper[4574]: I1004 04:52:19.405111 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 04 04:52:19 crc kubenswrapper[4574]: I1004 04:52:19.406042 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:52:49 crc kubenswrapper[4574]: I1004 04:52:49.404890 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:52:49 crc kubenswrapper[4574]: I1004 04:52:49.405614 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:53:19 crc kubenswrapper[4574]: I1004 04:53:19.404464 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:53:19 crc kubenswrapper[4574]: I1004 04:53:19.405430 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:53:19 crc kubenswrapper[4574]: I1004 04:53:19.405497 4574 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:53:19 crc kubenswrapper[4574]: I1004 04:53:19.406279 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0fb3a0b8021a1604e8449f303902c5fed8c9c286e729c6c4d14a3c41e72b327"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 04:53:19 crc kubenswrapper[4574]: I1004 04:53:19.406344 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://b0fb3a0b8021a1604e8449f303902c5fed8c9c286e729c6c4d14a3c41e72b327" gracePeriod=600 Oct 04 04:53:20 crc kubenswrapper[4574]: I1004 04:53:20.372549 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="b0fb3a0b8021a1604e8449f303902c5fed8c9c286e729c6c4d14a3c41e72b327" exitCode=0 Oct 04 04:53:20 crc kubenswrapper[4574]: I1004 04:53:20.372664 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"b0fb3a0b8021a1604e8449f303902c5fed8c9c286e729c6c4d14a3c41e72b327"} Oct 04 04:53:20 crc kubenswrapper[4574]: I1004 04:53:20.373101 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"bb670e871f07036b92f573b0b8c75028cb0c737ed5c76f73792c850b52dbde9a"} Oct 04 04:53:20 crc kubenswrapper[4574]: I1004 04:53:20.373126 4574 scope.go:117] "RemoveContainer" 
containerID="31714d129d030d05bc48d6fbdf031f5e04ff001e2aba61bec551b90384e6cb75" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.544574 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2jkhv"] Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.545744 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.570627 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2jkhv"] Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.715656 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.715719 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-registry-tls\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.715769 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 
04:54:30.715792 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-bound-sa-token\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.715821 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd67f\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-kube-api-access-sd67f\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.715837 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-trusted-ca\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.715860 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-registry-certificates\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.715887 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.743950 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817116 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-bound-sa-token\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817203 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd67f\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-kube-api-access-sd67f\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817268 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-trusted-ca\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817303 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-registry-certificates\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817339 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817363 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-registry-tls\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817410 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.817916 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.819291 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-trusted-ca\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.819517 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-registry-certificates\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.824256 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.824760 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-registry-tls\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.834653 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-bound-sa-token\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc 
kubenswrapper[4574]: I1004 04:54:30.835181 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd67f\" (UniqueName: \"kubernetes.io/projected/2f468dca-fac4-4ad3-adcd-0974dbd8d23a-kube-api-access-sd67f\") pod \"image-registry-66df7c8f76-2jkhv\" (UID: \"2f468dca-fac4-4ad3-adcd-0974dbd8d23a\") " pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:30 crc kubenswrapper[4574]: I1004 04:54:30.861881 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:31 crc kubenswrapper[4574]: I1004 04:54:31.045291 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2jkhv"] Oct 04 04:54:31 crc kubenswrapper[4574]: I1004 04:54:31.765684 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" event={"ID":"2f468dca-fac4-4ad3-adcd-0974dbd8d23a","Type":"ContainerStarted","Data":"13865fe7e3900ad30034c9b694dd4d1db66930b22a54495686832cef349a25b9"} Oct 04 04:54:31 crc kubenswrapper[4574]: I1004 04:54:31.765734 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" event={"ID":"2f468dca-fac4-4ad3-adcd-0974dbd8d23a","Type":"ContainerStarted","Data":"bd6cbecd3e8ea54835e08abba8cad80c33ea51601192e47b9f83c2d34d2ca770"} Oct 04 04:54:31 crc kubenswrapper[4574]: I1004 04:54:31.765838 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:31 crc kubenswrapper[4574]: I1004 04:54:31.788459 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" podStartSLOduration=1.7884351550000002 podStartE2EDuration="1.788435155s" podCreationTimestamp="2025-10-04 04:54:30 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:54:31.786480073 +0000 UTC m=+497.640623105" watchObservedRunningTime="2025-10-04 04:54:31.788435155 +0000 UTC m=+497.642578197" Oct 04 04:54:50 crc kubenswrapper[4574]: I1004 04:54:50.871051 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2jkhv" Oct 04 04:54:50 crc kubenswrapper[4574]: I1004 04:54:50.923163 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qbwcp"] Oct 04 04:55:15 crc kubenswrapper[4574]: I1004 04:55:15.964647 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" podUID="b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" containerName="registry" containerID="cri-o://c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470" gracePeriod=30 Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.357909 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.412826 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-trusted-ca\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.412923 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-ca-trust-extracted\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.413010 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-certificates\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.413057 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-tls\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.414153 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.414528 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.414592 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.414691 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8tgg\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-kube-api-access-r8tgg\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.414737 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-installation-pull-secrets\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.414772 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-bound-sa-token\") pod \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\" (UID: \"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe\") " Oct 04 
04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.415346 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.415368 4574 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.422033 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-kube-api-access-r8tgg" (OuterVolumeSpecName: "kube-api-access-r8tgg") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "kube-api-access-r8tgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.422943 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.430562 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.431050 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.434768 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.435544 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" (UID: "b0c00b7b-e35f-4fc1-ba0b-ff2315694afe"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.517014 4574 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.517093 4574 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.517103 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8tgg\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-kube-api-access-r8tgg\") on node \"crc\" DevicePath \"\"" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.517118 4574 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 04 04:55:16 crc kubenswrapper[4574]: I1004 04:55:16.517127 4574 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.034190 4574 generic.go:334] "Generic (PLEG): container finished" podID="b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" containerID="c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470" exitCode=0 Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.034290 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" 
event={"ID":"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe","Type":"ContainerDied","Data":"c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470"} Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.034380 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" event={"ID":"b0c00b7b-e35f-4fc1-ba0b-ff2315694afe","Type":"ContainerDied","Data":"7d651d632ca0c71d5144d5fe834db2128d7234539a3e77fcc8c27d793503be36"} Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.034410 4574 scope.go:117] "RemoveContainer" containerID="c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470" Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.034321 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qbwcp" Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.059226 4574 scope.go:117] "RemoveContainer" containerID="c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470" Oct 04 04:55:17 crc kubenswrapper[4574]: E1004 04:55:17.060251 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470\": container with ID starting with c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470 not found: ID does not exist" containerID="c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470" Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.060288 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470"} err="failed to get container status \"c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470\": rpc error: code = NotFound desc = could not find container \"c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470\": container with ID 
starting with c5ba7518b3eac8f1ce57ee5cbba544ada4473077ff4f15bbcbf476869a9cb470 not found: ID does not exist" Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.065888 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qbwcp"] Oct 04 04:55:17 crc kubenswrapper[4574]: I1004 04:55:17.065985 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qbwcp"] Oct 04 04:55:18 crc kubenswrapper[4574]: I1004 04:55:18.752621 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" path="/var/lib/kubelet/pods/b0c00b7b-e35f-4fc1-ba0b-ff2315694afe/volumes" Oct 04 04:55:19 crc kubenswrapper[4574]: I1004 04:55:19.405786 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:55:19 crc kubenswrapper[4574]: I1004 04:55:19.405866 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:55:49 crc kubenswrapper[4574]: I1004 04:55:49.404387 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:55:49 crc kubenswrapper[4574]: I1004 04:55:49.406146 4574 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:56:19 crc kubenswrapper[4574]: I1004 04:56:19.405648 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:56:19 crc kubenswrapper[4574]: I1004 04:56:19.406283 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:56:19 crc kubenswrapper[4574]: I1004 04:56:19.406342 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:56:19 crc kubenswrapper[4574]: I1004 04:56:19.407094 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb670e871f07036b92f573b0b8c75028cb0c737ed5c76f73792c850b52dbde9a"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 04:56:19 crc kubenswrapper[4574]: I1004 04:56:19.407169 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" 
containerID="cri-o://bb670e871f07036b92f573b0b8c75028cb0c737ed5c76f73792c850b52dbde9a" gracePeriod=600 Oct 04 04:56:20 crc kubenswrapper[4574]: I1004 04:56:20.401003 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="bb670e871f07036b92f573b0b8c75028cb0c737ed5c76f73792c850b52dbde9a" exitCode=0 Oct 04 04:56:20 crc kubenswrapper[4574]: I1004 04:56:20.401066 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"bb670e871f07036b92f573b0b8c75028cb0c737ed5c76f73792c850b52dbde9a"} Oct 04 04:56:20 crc kubenswrapper[4574]: I1004 04:56:20.401523 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"6a0b072b2db63c5fef6028adb7e7cc7f770356e62fc2cc2752bf99549d02a71e"} Oct 04 04:56:20 crc kubenswrapper[4574]: I1004 04:56:20.401542 4574 scope.go:117] "RemoveContainer" containerID="b0fb3a0b8021a1604e8449f303902c5fed8c9c286e729c6c4d14a3c41e72b327" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.113108 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-f7cbs"] Oct 04 04:57:34 crc kubenswrapper[4574]: E1004 04:57:34.113952 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" containerName="registry" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.113967 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" containerName="registry" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.114050 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c00b7b-e35f-4fc1-ba0b-ff2315694afe" containerName="registry" Oct 04 04:57:34 crc 
kubenswrapper[4574]: I1004 04:57:34.114541 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.118556 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qgjh7"] Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.118875 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.118891 4574 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-f4xf6" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.118902 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.119872 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qgjh7" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.128800 4574 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6tjns" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.137426 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-f7cbs"] Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.147055 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qgjh7"] Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.162010 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mfxk5"] Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.162723 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.165758 4574 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qldq4" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.194622 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mfxk5"] Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.248961 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnkxx\" (UniqueName: \"kubernetes.io/projected/25a7bfba-1bab-42d6-bb47-827aeeeefdbc-kube-api-access-lnkxx\") pod \"cert-manager-cainjector-7f985d654d-f7cbs\" (UID: \"25a7bfba-1bab-42d6-bb47-827aeeeefdbc\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.249032 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7klhv\" (UniqueName: \"kubernetes.io/projected/cd556473-f56f-419c-b1b9-3a59dca5f00f-kube-api-access-7klhv\") pod \"cert-manager-webhook-5655c58dd6-mfxk5\" (UID: \"cd556473-f56f-419c-b1b9-3a59dca5f00f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.249141 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkbm\" (UniqueName: \"kubernetes.io/projected/58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e-kube-api-access-nlkbm\") pod \"cert-manager-5b446d88c5-qgjh7\" (UID: \"58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e\") " pod="cert-manager/cert-manager-5b446d88c5-qgjh7" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.350981 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnkxx\" (UniqueName: 
\"kubernetes.io/projected/25a7bfba-1bab-42d6-bb47-827aeeeefdbc-kube-api-access-lnkxx\") pod \"cert-manager-cainjector-7f985d654d-f7cbs\" (UID: \"25a7bfba-1bab-42d6-bb47-827aeeeefdbc\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.351056 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7klhv\" (UniqueName: \"kubernetes.io/projected/cd556473-f56f-419c-b1b9-3a59dca5f00f-kube-api-access-7klhv\") pod \"cert-manager-webhook-5655c58dd6-mfxk5\" (UID: \"cd556473-f56f-419c-b1b9-3a59dca5f00f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.351092 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkbm\" (UniqueName: \"kubernetes.io/projected/58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e-kube-api-access-nlkbm\") pod \"cert-manager-5b446d88c5-qgjh7\" (UID: \"58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e\") " pod="cert-manager/cert-manager-5b446d88c5-qgjh7" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.375920 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkbm\" (UniqueName: \"kubernetes.io/projected/58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e-kube-api-access-nlkbm\") pod \"cert-manager-5b446d88c5-qgjh7\" (UID: \"58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e\") " pod="cert-manager/cert-manager-5b446d88c5-qgjh7" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.376875 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7klhv\" (UniqueName: \"kubernetes.io/projected/cd556473-f56f-419c-b1b9-3a59dca5f00f-kube-api-access-7klhv\") pod \"cert-manager-webhook-5655c58dd6-mfxk5\" (UID: \"cd556473-f56f-419c-b1b9-3a59dca5f00f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.378932 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lnkxx\" (UniqueName: \"kubernetes.io/projected/25a7bfba-1bab-42d6-bb47-827aeeeefdbc-kube-api-access-lnkxx\") pod \"cert-manager-cainjector-7f985d654d-f7cbs\" (UID: \"25a7bfba-1bab-42d6-bb47-827aeeeefdbc\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.438978 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.451657 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qgjh7" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.478005 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.795043 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mfxk5"] Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.799366 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.822717 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" event={"ID":"cd556473-f56f-419c-b1b9-3a59dca5f00f","Type":"ContainerStarted","Data":"dbd650e68a78f0d70b2b2f07e951b53c112852b11e45335ce75ee17ebcfd974e"} Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.908645 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qgjh7"] Oct 04 04:57:34 crc kubenswrapper[4574]: W1004 04:57:34.910845 4574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58237b74_7f6c_4cd8_b9ba_df68ba8f8c0e.slice/crio-1bb03219de60438bbd9d0df30524cc41749cef007f760e1d59f6a38000753bc2 WatchSource:0}: Error finding container 1bb03219de60438bbd9d0df30524cc41749cef007f760e1d59f6a38000753bc2: Status 404 returned error can't find the container with id 1bb03219de60438bbd9d0df30524cc41749cef007f760e1d59f6a38000753bc2 Oct 04 04:57:34 crc kubenswrapper[4574]: I1004 04:57:34.920043 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-f7cbs"] Oct 04 04:57:35 crc kubenswrapper[4574]: I1004 04:57:35.828491 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" event={"ID":"25a7bfba-1bab-42d6-bb47-827aeeeefdbc","Type":"ContainerStarted","Data":"b2910de27603519c9f246ffbc45414a7112469ecbb84063e5a8e02ec5feee547"} Oct 04 04:57:35 crc kubenswrapper[4574]: I1004 04:57:35.830162 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qgjh7" event={"ID":"58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e","Type":"ContainerStarted","Data":"1bb03219de60438bbd9d0df30524cc41749cef007f760e1d59f6a38000753bc2"} Oct 04 04:57:41 crc kubenswrapper[4574]: I1004 04:57:41.877185 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" event={"ID":"25a7bfba-1bab-42d6-bb47-827aeeeefdbc","Type":"ContainerStarted","Data":"c7057a4164e7c0f4d9c2b3dee70280264757c5219fc112b581ce78029d9b4872"} Oct 04 04:57:41 crc kubenswrapper[4574]: I1004 04:57:41.878683 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" event={"ID":"cd556473-f56f-419c-b1b9-3a59dca5f00f","Type":"ContainerStarted","Data":"2caf013c1fe703d1a001fca3573a69cdb190fbcd9c5e377508ef01009eabfe83"} Oct 04 04:57:41 crc kubenswrapper[4574]: I1004 04:57:41.878745 4574 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" Oct 04 04:57:41 crc kubenswrapper[4574]: I1004 04:57:41.879866 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qgjh7" event={"ID":"58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e","Type":"ContainerStarted","Data":"d5ac24e8af975fcdda3c1e000022517c7df784b0e2bd9db85240670ef187aed5"} Oct 04 04:57:41 crc kubenswrapper[4574]: I1004 04:57:41.920439 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-f7cbs" podStartSLOduration=1.41984122 podStartE2EDuration="7.920418423s" podCreationTimestamp="2025-10-04 04:57:34 +0000 UTC" firstStartedPulling="2025-10-04 04:57:34.933599547 +0000 UTC m=+680.787742589" lastFinishedPulling="2025-10-04 04:57:41.43417675 +0000 UTC m=+687.288319792" observedRunningTime="2025-10-04 04:57:41.89682969 +0000 UTC m=+687.750972732" watchObservedRunningTime="2025-10-04 04:57:41.920418423 +0000 UTC m=+687.774561465" Oct 04 04:57:41 crc kubenswrapper[4574]: I1004 04:57:41.937731 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-qgjh7" podStartSLOduration=1.421085626 podStartE2EDuration="7.937711434s" podCreationTimestamp="2025-10-04 04:57:34 +0000 UTC" firstStartedPulling="2025-10-04 04:57:34.913443443 +0000 UTC m=+680.767586505" lastFinishedPulling="2025-10-04 04:57:41.430069271 +0000 UTC m=+687.284212313" observedRunningTime="2025-10-04 04:57:41.936707905 +0000 UTC m=+687.790850967" watchObservedRunningTime="2025-10-04 04:57:41.937711434 +0000 UTC m=+687.791854476" Oct 04 04:57:41 crc kubenswrapper[4574]: I1004 04:57:41.938102 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" podStartSLOduration=1.306451746 podStartE2EDuration="7.938095885s" podCreationTimestamp="2025-10-04 04:57:34 +0000 UTC" 
firstStartedPulling="2025-10-04 04:57:34.799089371 +0000 UTC m=+680.653232413" lastFinishedPulling="2025-10-04 04:57:41.43073351 +0000 UTC m=+687.284876552" observedRunningTime="2025-10-04 04:57:41.922638757 +0000 UTC m=+687.776781799" watchObservedRunningTime="2025-10-04 04:57:41.938095885 +0000 UTC m=+687.792238917" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.388954 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ntdng"] Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.389843 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-controller" containerID="cri-o://59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.389879 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.389971 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="northd" containerID="cri-o://cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.389915 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="nbdb" containerID="cri-o://f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 
04:57:44.389994 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-node" containerID="cri-o://291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.390008 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-acl-logging" containerID="cri-o://b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.390324 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="sbdb" containerID="cri-o://653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.445428 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" containerID="cri-o://60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112" gracePeriod=30 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.900928 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/2.log" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.901702 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/1.log" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.901846 4574 generic.go:334] "Generic (PLEG): container finished" 
podID="649982aa-c9c5-41ce-a056-48ad058e9aa5" containerID="71122238bbcb5eb7ac6d2b213c66c02622792750a759fbf70a6697d445b3535f" exitCode=2 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.901988 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerDied","Data":"71122238bbcb5eb7ac6d2b213c66c02622792750a759fbf70a6697d445b3535f"} Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.902086 4574 scope.go:117] "RemoveContainer" containerID="231a954e5442330b920164702e31a7ab9aa5b4a5e012cec100d5d03631ef3707" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.902949 4574 scope.go:117] "RemoveContainer" containerID="71122238bbcb5eb7ac6d2b213c66c02622792750a759fbf70a6697d445b3535f" Oct 04 04:57:44 crc kubenswrapper[4574]: E1004 04:57:44.903255 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6wsfn_openshift-multus(649982aa-c9c5-41ce-a056-48ad058e9aa5)\"" pod="openshift-multus/multus-6wsfn" podUID="649982aa-c9c5-41ce-a056-48ad058e9aa5" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.911820 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovnkube-controller/3.log" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.914917 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovn-acl-logging/0.log" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915363 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovn-controller/0.log" Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915639 4574 generic.go:334] "Generic (PLEG): 
container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112" exitCode=0 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915658 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364" exitCode=0 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915666 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608" exitCode=0 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915675 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8" exitCode=143 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915682 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915" exitCode=143 Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915707 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112"} Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915733 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364"} Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915744 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" 
event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608"} Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915753 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8"} Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.915762 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915"} Oct 04 04:57:44 crc kubenswrapper[4574]: I1004 04:57:44.958688 4574 scope.go:117] "RemoveContainer" containerID="ce4da09985295d9898b7456bf6cdf752a8917b46418b79fdb54ad8e0639921f5" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.091919 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovn-acl-logging/0.log" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.092514 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovn-controller/0.log" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.093122 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.180611 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x8fxh"] Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.180917 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-acl-logging" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.180939 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-acl-logging" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.180953 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kubecfg-setup" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.180963 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kubecfg-setup" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.180973 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="sbdb" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.180983 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="sbdb" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.180993 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="northd" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181001 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="northd" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181017 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-node" Oct 
04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181026 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-node" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181037 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="nbdb" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181045 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="nbdb" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181056 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-ovn-metrics" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181064 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-ovn-metrics" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181077 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181085 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181099 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181107 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181121 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 
04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181130 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181141 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181149 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181159 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181167 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: E1004 04:57:45.181179 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181187 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181333 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-node" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181347 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181357 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" 
containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181369 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181380 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181391 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="northd" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181400 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181410 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="kube-rbac-proxy-ovn-metrics" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181422 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovn-acl-logging" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181432 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="sbdb" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181444 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="nbdb" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.181823 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerName="ovnkube-controller" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.183928 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193031 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-ovn\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193093 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-bin\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193119 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-var-lib-openvswitch\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193158 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-etc-openvswitch\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193194 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovn-node-metrics-cert\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193249 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-node-log\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193280 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-env-overrides\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193298 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-netns\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193328 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-netd\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193353 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-slash\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193389 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-kubelet\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 
04:57:45.193429 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-log-socket\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193457 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znkbp\" (UniqueName: \"kubernetes.io/projected/e473790c-4fad-4637-9d72-0dd6310b4ae0-kube-api-access-znkbp\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193480 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-systemd-units\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193507 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-ovn-kubernetes\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193542 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-script-lib\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193574 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-openvswitch\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193596 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193631 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-systemd\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193651 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-config\") pod \"e473790c-4fad-4637-9d72-0dd6310b4ae0\" (UID: \"e473790c-4fad-4637-9d72-0dd6310b4ae0\") " Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193322 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193353 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194656 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193403 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193906 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.193940 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-log-socket" (OuterVolumeSpecName: "log-socket") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194101 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194110 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194119 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194356 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194383 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194493 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194800 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.194829 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-node-log" (OuterVolumeSpecName: "node-log") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.195195 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.195320 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-slash" (OuterVolumeSpecName: "host-slash") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.195929 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.202814 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e473790c-4fad-4637-9d72-0dd6310b4ae0-kube-api-access-znkbp" (OuterVolumeSpecName: "kube-api-access-znkbp") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "kube-api-access-znkbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.203421 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.224755 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e473790c-4fad-4637-9d72-0dd6310b4ae0" (UID: "e473790c-4fad-4637-9d72-0dd6310b4ae0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.294937 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-systemd-units\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.294998 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-var-lib-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295029 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-run-ovn-kubernetes\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295076 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cl9\" (UniqueName: \"kubernetes.io/projected/dda72c49-d80d-4dd6-8930-96d28e558b93-kube-api-access-87cl9\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295105 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-cni-netd\") pod 
\"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295137 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-ovnkube-script-lib\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295159 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-slash\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295188 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-node-log\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295220 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295274 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/dda72c49-d80d-4dd6-8930-96d28e558b93-ovn-node-metrics-cert\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295296 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-kubelet\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295322 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-env-overrides\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295349 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295381 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-log-socket\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295406 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-etc-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295429 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-ovn\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295447 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-cni-bin\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295467 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-systemd\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295484 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-ovnkube-config\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295511 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-run-netns\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295571 4574 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295584 4574 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-slash\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295595 4574 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295604 4574 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-log-socket\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295614 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znkbp\" (UniqueName: \"kubernetes.io/projected/e473790c-4fad-4637-9d72-0dd6310b4ae0-kube-api-access-znkbp\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295629 4574 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295638 4574 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295647 4574 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295655 4574 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295665 4574 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295675 4574 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295685 4574 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295694 4574 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295703 4574 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295712 4574 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295720 4574 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295728 4574 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e473790c-4fad-4637-9d72-0dd6310b4ae0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295736 4574 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-node-log\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295744 4574 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473790c-4fad-4637-9d72-0dd6310b4ae0-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.295752 4574 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473790c-4fad-4637-9d72-0dd6310b4ae0-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397622 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-ovnkube-script-lib\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397709 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-slash\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397766 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-node-log\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397784 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-slash\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397803 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397845 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-node-log\") pod \"ovnkube-node-x8fxh\" (UID: 
\"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397860 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dda72c49-d80d-4dd6-8930-96d28e558b93-ovn-node-metrics-cert\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397881 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397887 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-kubelet\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397933 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-env-overrides\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.397957 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x8fxh\" (UID: 
\"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398006 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-log-socket\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398028 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-cni-bin\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398069 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-etc-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398092 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-ovn\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398119 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-systemd\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 
crc kubenswrapper[4574]: I1004 04:57:45.398166 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-ovnkube-config\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398206 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-run-netns\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398270 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-var-lib-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398321 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-systemd-units\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398350 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-run-ovn-kubernetes\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398384 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-kubelet\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398449 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398510 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-log-socket\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398558 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-cni-bin\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398574 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-ovnkube-script-lib\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398609 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-ovn\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398587 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-etc-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398621 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-env-overrides\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398655 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-run-systemd\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398407 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87cl9\" (UniqueName: \"kubernetes.io/projected/dda72c49-d80d-4dd6-8930-96d28e558b93-kube-api-access-87cl9\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398651 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-run-netns\") pod 
\"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398710 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-systemd-units\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398694 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-run-ovn-kubernetes\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398729 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-var-lib-openvswitch\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398822 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-cni-netd\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.398906 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dda72c49-d80d-4dd6-8930-96d28e558b93-host-cni-netd\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.399154 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dda72c49-d80d-4dd6-8930-96d28e558b93-ovnkube-config\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.401783 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dda72c49-d80d-4dd6-8930-96d28e558b93-ovn-node-metrics-cert\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.419100 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cl9\" (UniqueName: \"kubernetes.io/projected/dda72c49-d80d-4dd6-8930-96d28e558b93-kube-api-access-87cl9\") pod \"ovnkube-node-x8fxh\" (UID: \"dda72c49-d80d-4dd6-8930-96d28e558b93\") " pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.507987 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:45 crc kubenswrapper[4574]: W1004 04:57:45.529570 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddda72c49_d80d_4dd6_8930_96d28e558b93.slice/crio-0b5585ad55d10ea244764e856ea505481b9ac29ebbf9438bd1409058c82c8938 WatchSource:0}: Error finding container 0b5585ad55d10ea244764e856ea505481b9ac29ebbf9438bd1409058c82c8938: Status 404 returned error can't find the container with id 0b5585ad55d10ea244764e856ea505481b9ac29ebbf9438bd1409058c82c8938 Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.924049 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovn-acl-logging/0.log" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.924679 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ntdng_e473790c-4fad-4637-9d72-0dd6310b4ae0/ovn-controller/0.log" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.924971 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2" exitCode=0 Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925046 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4" exitCode=0 Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925136 4574 generic.go:334] "Generic (PLEG): container finished" podID="e473790c-4fad-4637-9d72-0dd6310b4ae0" containerID="cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e" exitCode=0 Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925112 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2"} Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925338 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4"} Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925424 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e"} Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925113 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925500 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ntdng" event={"ID":"e473790c-4fad-4637-9d72-0dd6310b4ae0","Type":"ContainerDied","Data":"440c0c35e5a6ea9dfecf337d50ad9e8c4f7c6fa3d2f43e20aff370d355af9bee"} Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.925369 4574 scope.go:117] "RemoveContainer" containerID="60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.926690 4574 generic.go:334] "Generic (PLEG): container finished" podID="dda72c49-d80d-4dd6-8930-96d28e558b93" containerID="67f5a45b8cd419ff79430a35deabe821eff7091453d3ee20db56e224a5f78702" exitCode=0 Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.926732 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" 
event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerDied","Data":"67f5a45b8cd419ff79430a35deabe821eff7091453d3ee20db56e224a5f78702"} Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.926845 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"0b5585ad55d10ea244764e856ea505481b9ac29ebbf9438bd1409058c82c8938"} Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.928839 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/2.log" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.945506 4574 scope.go:117] "RemoveContainer" containerID="653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2" Oct 04 04:57:45 crc kubenswrapper[4574]: I1004 04:57:45.965622 4574 scope.go:117] "RemoveContainer" containerID="f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:45.999736 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ntdng"] Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.008490 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ntdng"] Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.013537 4574 scope.go:117] "RemoveContainer" containerID="cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.030372 4574 scope.go:117] "RemoveContainer" containerID="9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.056744 4574 scope.go:117] "RemoveContainer" containerID="291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.072907 4574 scope.go:117] 
"RemoveContainer" containerID="b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.093157 4574 scope.go:117] "RemoveContainer" containerID="59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.120555 4574 scope.go:117] "RemoveContainer" containerID="438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.137621 4574 scope.go:117] "RemoveContainer" containerID="60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.138564 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112\": container with ID starting with 60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112 not found: ID does not exist" containerID="60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.138623 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112"} err="failed to get container status \"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112\": rpc error: code = NotFound desc = could not find container \"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112\": container with ID starting with 60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.138669 4574 scope.go:117] "RemoveContainer" containerID="653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.138992 4574 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\": container with ID starting with 653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2 not found: ID does not exist" containerID="653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.139025 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2"} err="failed to get container status \"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\": rpc error: code = NotFound desc = could not find container \"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\": container with ID starting with 653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.139042 4574 scope.go:117] "RemoveContainer" containerID="f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.139342 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\": container with ID starting with f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4 not found: ID does not exist" containerID="f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.139414 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4"} err="failed to get container status \"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\": rpc error: code = NotFound desc = could not find container 
\"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\": container with ID starting with f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.139448 4574 scope.go:117] "RemoveContainer" containerID="cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.143667 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\": container with ID starting with cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e not found: ID does not exist" containerID="cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.143720 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e"} err="failed to get container status \"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\": rpc error: code = NotFound desc = could not find container \"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\": container with ID starting with cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.143747 4574 scope.go:117] "RemoveContainer" containerID="9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.144742 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\": container with ID starting with 9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364 not found: ID does not exist" 
containerID="9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.144768 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364"} err="failed to get container status \"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\": rpc error: code = NotFound desc = could not find container \"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\": container with ID starting with 9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.144781 4574 scope.go:117] "RemoveContainer" containerID="291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.145170 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\": container with ID starting with 291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608 not found: ID does not exist" containerID="291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.145223 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608"} err="failed to get container status \"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\": rpc error: code = NotFound desc = could not find container \"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\": container with ID starting with 291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.145278 4574 scope.go:117] 
"RemoveContainer" containerID="b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.145677 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\": container with ID starting with b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8 not found: ID does not exist" containerID="b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.145716 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8"} err="failed to get container status \"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\": rpc error: code = NotFound desc = could not find container \"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\": container with ID starting with b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.145731 4574 scope.go:117] "RemoveContainer" containerID="59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.146018 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\": container with ID starting with 59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915 not found: ID does not exist" containerID="59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.146039 4574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915"} err="failed to get container status \"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\": rpc error: code = NotFound desc = could not find container \"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\": container with ID starting with 59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.146054 4574 scope.go:117] "RemoveContainer" containerID="438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d" Oct 04 04:57:46 crc kubenswrapper[4574]: E1004 04:57:46.146398 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\": container with ID starting with 438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d not found: ID does not exist" containerID="438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.146426 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d"} err="failed to get container status \"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\": rpc error: code = NotFound desc = could not find container \"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\": container with ID starting with 438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.146442 4574 scope.go:117] "RemoveContainer" containerID="60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.146671 4574 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112"} err="failed to get container status \"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112\": rpc error: code = NotFound desc = could not find container \"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112\": container with ID starting with 60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.146708 4574 scope.go:117] "RemoveContainer" containerID="653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.147044 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2"} err="failed to get container status \"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\": rpc error: code = NotFound desc = could not find container \"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\": container with ID starting with 653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.147084 4574 scope.go:117] "RemoveContainer" containerID="f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.147411 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4"} err="failed to get container status \"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\": rpc error: code = NotFound desc = could not find container \"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\": container with ID starting with f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4 not 
found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.147433 4574 scope.go:117] "RemoveContainer" containerID="cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.147688 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e"} err="failed to get container status \"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\": rpc error: code = NotFound desc = could not find container \"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\": container with ID starting with cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.147715 4574 scope.go:117] "RemoveContainer" containerID="9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.148021 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364"} err="failed to get container status \"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\": rpc error: code = NotFound desc = could not find container \"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\": container with ID starting with 9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.148042 4574 scope.go:117] "RemoveContainer" containerID="291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.148362 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608"} err="failed to get 
container status \"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\": rpc error: code = NotFound desc = could not find container \"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\": container with ID starting with 291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.148387 4574 scope.go:117] "RemoveContainer" containerID="b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.148617 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8"} err="failed to get container status \"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\": rpc error: code = NotFound desc = could not find container \"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\": container with ID starting with b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.148642 4574 scope.go:117] "RemoveContainer" containerID="59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149033 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915"} err="failed to get container status \"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\": rpc error: code = NotFound desc = could not find container \"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\": container with ID starting with 59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149053 4574 scope.go:117] "RemoveContainer" 
containerID="438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149295 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d"} err="failed to get container status \"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\": rpc error: code = NotFound desc = could not find container \"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\": container with ID starting with 438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149319 4574 scope.go:117] "RemoveContainer" containerID="60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149629 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112"} err="failed to get container status \"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112\": rpc error: code = NotFound desc = could not find container \"60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112\": container with ID starting with 60c096c621256436eeea47df7665b494f9edae05fbf02a38445930e6e6c26112 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149658 4574 scope.go:117] "RemoveContainer" containerID="653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149937 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2"} err="failed to get container status \"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\": rpc error: code = NotFound desc = could 
not find container \"653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2\": container with ID starting with 653086a046bd275e347b85cdf0e3f34ccb3af35cd51ed8dd5f3e6091909db0a2 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.149959 4574 scope.go:117] "RemoveContainer" containerID="f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.150190 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4"} err="failed to get container status \"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\": rpc error: code = NotFound desc = could not find container \"f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4\": container with ID starting with f8857f8866d73e21b7c4251c123945b60d0b4381ed357a8f85ed79bf74e12dd4 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.150210 4574 scope.go:117] "RemoveContainer" containerID="cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.150462 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e"} err="failed to get container status \"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\": rpc error: code = NotFound desc = could not find container \"cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e\": container with ID starting with cbee1d38734ff45e3d942611b25059449f4f2924aa133d4537f02ca8040a278e not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.150587 4574 scope.go:117] "RemoveContainer" containerID="9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 
04:57:46.150879 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364"} err="failed to get container status \"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\": rpc error: code = NotFound desc = could not find container \"9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364\": container with ID starting with 9745ec447fb775784a4b58043d0532308749b4678a263998e0bc7d0e4ea11364 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.150899 4574 scope.go:117] "RemoveContainer" containerID="291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.151184 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608"} err="failed to get container status \"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\": rpc error: code = NotFound desc = could not find container \"291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608\": container with ID starting with 291c8fed1fc86047a12e689033effabbf59ac4dedfcc181c4cb5b26fb6815608 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.151206 4574 scope.go:117] "RemoveContainer" containerID="b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.151498 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8"} err="failed to get container status \"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\": rpc error: code = NotFound desc = could not find container \"b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8\": container with ID starting with 
b659d847a4476d2bd845f259a42cf3c767161aa3d16c5c8895f027ee839662d8 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.151584 4574 scope.go:117] "RemoveContainer" containerID="59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.151913 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915"} err="failed to get container status \"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\": rpc error: code = NotFound desc = could not find container \"59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915\": container with ID starting with 59dc2349c7713ce7bf92732baa46433eb4c3e54b88d21ccca78f6a440f64b915 not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.151932 4574 scope.go:117] "RemoveContainer" containerID="438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.152167 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d"} err="failed to get container status \"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\": rpc error: code = NotFound desc = could not find container \"438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d\": container with ID starting with 438ccea4174e391d620ce46a6b51f552ec6c35748f7242ace738fdb58c9cb37d not found: ID does not exist" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.740530 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e473790c-4fad-4637-9d72-0dd6310b4ae0" path="/var/lib/kubelet/pods/e473790c-4fad-4637-9d72-0dd6310b4ae0/volumes" Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.936198 4574 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"cc37e19a393e80aa51c04431f39d56490ba9e0307b5280054c4f25174553b601"} Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.936565 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"c724fdf5d0144c0ccfc3bac4cfbe43a1fcfabff344d7a4a35c56244772e3919e"} Oct 04 04:57:46 crc kubenswrapper[4574]: I1004 04:57:46.936580 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"84f226fb6984d0ddc534f1cfd8fe3ae058e1319d0fd2f4c8dcaa436aa3f18104"} Oct 04 04:57:47 crc kubenswrapper[4574]: I1004 04:57:47.965543 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"7300d9fd6dbbec6a8bb152f07fb31a9c247694cbe6ec37bb307d37e0168a4c8c"} Oct 04 04:57:47 crc kubenswrapper[4574]: I1004 04:57:47.965625 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"a90df3a56e164bc058261ebbe7f78e71dec7a5def3961f9501b514518aa98cd2"} Oct 04 04:57:47 crc kubenswrapper[4574]: I1004 04:57:47.965642 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"5caab27c5acf4e7341fe13ec7947647b5de004025fe462f1663d65287e0918ba"} Oct 04 04:57:49 crc kubenswrapper[4574]: I1004 04:57:49.482026 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-mfxk5" Oct 04 04:57:49 crc kubenswrapper[4574]: I1004 04:57:49.982372 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"01f03844008d76023cabdfbd436220653486d83cc8c6b0cfd8773b2a1c885e0c"} Oct 04 04:57:51 crc kubenswrapper[4574]: I1004 04:57:51.999048 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" event={"ID":"dda72c49-d80d-4dd6-8930-96d28e558b93","Type":"ContainerStarted","Data":"c73003ed7f53dac72db584568a19990308efd12ef4242fb4c4a96331b68be7d9"} Oct 04 04:57:51 crc kubenswrapper[4574]: I1004 04:57:51.999524 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:52 crc kubenswrapper[4574]: I1004 04:57:52.000600 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:52 crc kubenswrapper[4574]: I1004 04:57:52.000812 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:52 crc kubenswrapper[4574]: I1004 04:57:52.030948 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:52 crc kubenswrapper[4574]: I1004 04:57:52.031120 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:57:52 crc kubenswrapper[4574]: I1004 04:57:52.075055 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" podStartSLOduration=7.075039086 podStartE2EDuration="7.075039086s" podCreationTimestamp="2025-10-04 04:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:57:52.04515405 +0000 UTC m=+697.899297102" watchObservedRunningTime="2025-10-04 04:57:52.075039086 +0000 UTC m=+697.929182118" Oct 04 04:57:59 crc kubenswrapper[4574]: I1004 04:57:59.733543 4574 scope.go:117] "RemoveContainer" containerID="71122238bbcb5eb7ac6d2b213c66c02622792750a759fbf70a6697d445b3535f" Oct 04 04:57:59 crc kubenswrapper[4574]: E1004 04:57:59.734461 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6wsfn_openshift-multus(649982aa-c9c5-41ce-a056-48ad058e9aa5)\"" pod="openshift-multus/multus-6wsfn" podUID="649982aa-c9c5-41ce-a056-48ad058e9aa5" Oct 04 04:58:11 crc kubenswrapper[4574]: I1004 04:58:11.733467 4574 scope.go:117] "RemoveContainer" containerID="71122238bbcb5eb7ac6d2b213c66c02622792750a759fbf70a6697d445b3535f" Oct 04 04:58:12 crc kubenswrapper[4574]: I1004 04:58:12.126533 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6wsfn_649982aa-c9c5-41ce-a056-48ad058e9aa5/kube-multus/2.log" Oct 04 04:58:12 crc kubenswrapper[4574]: I1004 04:58:12.127166 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6wsfn" event={"ID":"649982aa-c9c5-41ce-a056-48ad058e9aa5","Type":"ContainerStarted","Data":"da594512a9f6b8f108613152511876ef135ee4fb0f9301922a1b9f0fbfe34039"} Oct 04 04:58:15 crc kubenswrapper[4574]: I1004 04:58:15.531929 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x8fxh" Oct 04 04:58:19 crc kubenswrapper[4574]: I1004 04:58:19.404410 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 04 04:58:19 crc kubenswrapper[4574]: I1004 04:58:19.404824 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.457940 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw"] Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.459898 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.462576 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.481680 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw"] Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.582806 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.582858 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-bundle\") 
pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.583002 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vlkq\" (UniqueName: \"kubernetes.io/projected/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-kube-api-access-8vlkq\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.684195 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.684279 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.684316 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vlkq\" (UniqueName: \"kubernetes.io/projected/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-kube-api-access-8vlkq\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: 
\"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.685012 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.685370 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.706669 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vlkq\" (UniqueName: \"kubernetes.io/projected/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-kube-api-access-8vlkq\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.780113 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:35 crc kubenswrapper[4574]: I1004 04:58:35.979923 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw"] Oct 04 04:58:36 crc kubenswrapper[4574]: I1004 04:58:36.269933 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" event={"ID":"108ad5dd-cca2-4fcd-9f61-e3337ad0da82","Type":"ContainerStarted","Data":"d7df7411ca81fc23334cc4d9e8c3c1b3c796a525bcc740977be8cf6b20c117b1"} Oct 04 04:58:36 crc kubenswrapper[4574]: I1004 04:58:36.269981 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" event={"ID":"108ad5dd-cca2-4fcd-9f61-e3337ad0da82","Type":"ContainerStarted","Data":"146120b0a7b0d3d8448acb643a5da377b59715c065aaa1fe6033124b432631fd"} Oct 04 04:58:37 crc kubenswrapper[4574]: I1004 04:58:37.278301 4574 generic.go:334] "Generic (PLEG): container finished" podID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerID="d7df7411ca81fc23334cc4d9e8c3c1b3c796a525bcc740977be8cf6b20c117b1" exitCode=0 Oct 04 04:58:37 crc kubenswrapper[4574]: I1004 04:58:37.278354 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" event={"ID":"108ad5dd-cca2-4fcd-9f61-e3337ad0da82","Type":"ContainerDied","Data":"d7df7411ca81fc23334cc4d9e8c3c1b3c796a525bcc740977be8cf6b20c117b1"} Oct 04 04:58:39 crc kubenswrapper[4574]: I1004 04:58:39.292464 4574 generic.go:334] "Generic (PLEG): container finished" podID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerID="1c22e96831d9205bcc520f62b1f1e015178d4c43ce0b8cfa9ce9ecb17fd06181" exitCode=0 Oct 04 04:58:39 crc kubenswrapper[4574]: I1004 04:58:39.292550 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" event={"ID":"108ad5dd-cca2-4fcd-9f61-e3337ad0da82","Type":"ContainerDied","Data":"1c22e96831d9205bcc520f62b1f1e015178d4c43ce0b8cfa9ce9ecb17fd06181"} Oct 04 04:58:40 crc kubenswrapper[4574]: I1004 04:58:40.301169 4574 generic.go:334] "Generic (PLEG): container finished" podID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerID="6dfdf4b58daa16c5be1702895a2ef25f6c9499c218488cf03772a552606bcaea" exitCode=0 Oct 04 04:58:40 crc kubenswrapper[4574]: I1004 04:58:40.301276 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" event={"ID":"108ad5dd-cca2-4fcd-9f61-e3337ad0da82","Type":"ContainerDied","Data":"6dfdf4b58daa16c5be1702895a2ef25f6c9499c218488cf03772a552606bcaea"} Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.528849 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.657306 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vlkq\" (UniqueName: \"kubernetes.io/projected/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-kube-api-access-8vlkq\") pod \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.657685 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-util\") pod \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.657727 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-bundle\") pod \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\" (UID: \"108ad5dd-cca2-4fcd-9f61-e3337ad0da82\") " Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.659124 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-bundle" (OuterVolumeSpecName: "bundle") pod "108ad5dd-cca2-4fcd-9f61-e3337ad0da82" (UID: "108ad5dd-cca2-4fcd-9f61-e3337ad0da82"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.666060 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-kube-api-access-8vlkq" (OuterVolumeSpecName: "kube-api-access-8vlkq") pod "108ad5dd-cca2-4fcd-9f61-e3337ad0da82" (UID: "108ad5dd-cca2-4fcd-9f61-e3337ad0da82"). InnerVolumeSpecName "kube-api-access-8vlkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.716466 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-util" (OuterVolumeSpecName: "util") pod "108ad5dd-cca2-4fcd-9f61-e3337ad0da82" (UID: "108ad5dd-cca2-4fcd-9f61-e3337ad0da82"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.759388 4574 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-util\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.759597 4574 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:41 crc kubenswrapper[4574]: I1004 04:58:41.759692 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vlkq\" (UniqueName: \"kubernetes.io/projected/108ad5dd-cca2-4fcd-9f61-e3337ad0da82-kube-api-access-8vlkq\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:42 crc kubenswrapper[4574]: I1004 04:58:42.313582 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" event={"ID":"108ad5dd-cca2-4fcd-9f61-e3337ad0da82","Type":"ContainerDied","Data":"146120b0a7b0d3d8448acb643a5da377b59715c065aaa1fe6033124b432631fd"} Oct 04 04:58:42 crc kubenswrapper[4574]: I1004 04:58:42.313891 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="146120b0a7b0d3d8448acb643a5da377b59715c065aaa1fe6033124b432631fd" Oct 04 04:58:42 crc kubenswrapper[4574]: I1004 04:58:42.313659 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.529150 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp"] Oct 04 04:58:44 crc kubenswrapper[4574]: E1004 04:58:44.529770 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerName="pull" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.529789 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerName="pull" Oct 04 04:58:44 crc kubenswrapper[4574]: E1004 04:58:44.529802 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerName="util" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.529809 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerName="util" Oct 04 04:58:44 crc kubenswrapper[4574]: E1004 04:58:44.529832 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerName="extract" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.529842 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerName="extract" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.530000 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="108ad5dd-cca2-4fcd-9f61-e3337ad0da82" containerName="extract" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.530547 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.540349 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.541000 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ncdtj" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.546222 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.556358 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp"] Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.595395 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlzws\" (UniqueName: \"kubernetes.io/projected/34ee31e2-d15b-4055-9e27-2ce2e9e43c28-kube-api-access-vlzws\") pod \"nmstate-operator-858ddd8f98-cxhlp\" (UID: \"34ee31e2-d15b-4055-9e27-2ce2e9e43c28\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.697083 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlzws\" (UniqueName: \"kubernetes.io/projected/34ee31e2-d15b-4055-9e27-2ce2e9e43c28-kube-api-access-vlzws\") pod \"nmstate-operator-858ddd8f98-cxhlp\" (UID: \"34ee31e2-d15b-4055-9e27-2ce2e9e43c28\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.715626 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlzws\" (UniqueName: \"kubernetes.io/projected/34ee31e2-d15b-4055-9e27-2ce2e9e43c28-kube-api-access-vlzws\") pod \"nmstate-operator-858ddd8f98-cxhlp\" (UID: 
\"34ee31e2-d15b-4055-9e27-2ce2e9e43c28\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" Oct 04 04:58:44 crc kubenswrapper[4574]: I1004 04:58:44.849225 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" Oct 04 04:58:45 crc kubenswrapper[4574]: I1004 04:58:45.065690 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp"] Oct 04 04:58:45 crc kubenswrapper[4574]: I1004 04:58:45.341653 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" event={"ID":"34ee31e2-d15b-4055-9e27-2ce2e9e43c28","Type":"ContainerStarted","Data":"513be86f1e7bc4599e447b8151458f9ad37d229037cd14d952173c45f14ae455"} Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.207046 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k52jj"] Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.207280 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" podUID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" containerName="controller-manager" containerID="cri-o://11f6a7b4a35c83287d7aed50c86cf756f8877b7c462f55a3935ed25c886217ac" gracePeriod=30 Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.301149 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn"] Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.301737 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" podUID="f3111436-b5d8-405e-ab14-2fb33bd107c0" containerName="route-controller-manager" containerID="cri-o://93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f" gracePeriod=30 Oct 
04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.359814 4574 generic.go:334] "Generic (PLEG): container finished" podID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" containerID="11f6a7b4a35c83287d7aed50c86cf756f8877b7c462f55a3935ed25c886217ac" exitCode=0 Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.359876 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" event={"ID":"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf","Type":"ContainerDied","Data":"11f6a7b4a35c83287d7aed50c86cf756f8877b7c462f55a3935ed25c886217ac"} Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.675396 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.716038 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.743773 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert\") pod \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.743828 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlz45\" (UniqueName: \"kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45\") pod \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.743870 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config\") pod 
\"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.743899 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles\") pod \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.743967 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca\") pod \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\" (UID: \"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.744678 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca" (OuterVolumeSpecName: "client-ca") pod "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.745035 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config" (OuterVolumeSpecName: "config") pod "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.745047 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.749724 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45" (OuterVolumeSpecName: "kube-api-access-hlz45") pod "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf"). InnerVolumeSpecName "kube-api-access-hlz45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.750640 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" (UID: "f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845054 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3111436-b5d8-405e-ab14-2fb33bd107c0-serving-cert\") pod \"f3111436-b5d8-405e-ab14-2fb33bd107c0\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845114 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-client-ca\") pod \"f3111436-b5d8-405e-ab14-2fb33bd107c0\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845163 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-config\") pod \"f3111436-b5d8-405e-ab14-2fb33bd107c0\" (UID: 
\"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845209 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-479vz\" (UniqueName: \"kubernetes.io/projected/f3111436-b5d8-405e-ab14-2fb33bd107c0-kube-api-access-479vz\") pod \"f3111436-b5d8-405e-ab14-2fb33bd107c0\" (UID: \"f3111436-b5d8-405e-ab14-2fb33bd107c0\") " Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845510 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845706 4574 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845913 4574 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-client-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845931 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.845942 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlz45\" (UniqueName: \"kubernetes.io/projected/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf-kube-api-access-hlz45\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.846052 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"f3111436-b5d8-405e-ab14-2fb33bd107c0" (UID: "f3111436-b5d8-405e-ab14-2fb33bd107c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.846146 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-config" (OuterVolumeSpecName: "config") pod "f3111436-b5d8-405e-ab14-2fb33bd107c0" (UID: "f3111436-b5d8-405e-ab14-2fb33bd107c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.848649 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3111436-b5d8-405e-ab14-2fb33bd107c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f3111436-b5d8-405e-ab14-2fb33bd107c0" (UID: "f3111436-b5d8-405e-ab14-2fb33bd107c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.848664 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3111436-b5d8-405e-ab14-2fb33bd107c0-kube-api-access-479vz" (OuterVolumeSpecName: "kube-api-access-479vz") pod "f3111436-b5d8-405e-ab14-2fb33bd107c0" (UID: "f3111436-b5d8-405e-ab14-2fb33bd107c0"). InnerVolumeSpecName "kube-api-access-479vz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.890952 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b8d54f884-7g7k6"] Oct 04 04:58:47 crc kubenswrapper[4574]: E1004 04:58:47.891171 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3111436-b5d8-405e-ab14-2fb33bd107c0" containerName="route-controller-manager" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.891183 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3111436-b5d8-405e-ab14-2fb33bd107c0" containerName="route-controller-manager" Oct 04 04:58:47 crc kubenswrapper[4574]: E1004 04:58:47.891195 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" containerName="controller-manager" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.891201 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" containerName="controller-manager" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.891345 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" containerName="controller-manager" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.891366 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3111436-b5d8-405e-ab14-2fb33bd107c0" containerName="route-controller-manager" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.891732 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.907280 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b8d54f884-7g7k6"] Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.947443 4574 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3111436-b5d8-405e-ab14-2fb33bd107c0-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.947495 4574 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-client-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.947505 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3111436-b5d8-405e-ab14-2fb33bd107c0-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.947518 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-479vz\" (UniqueName: \"kubernetes.io/projected/f3111436-b5d8-405e-ab14-2fb33bd107c0-kube-api-access-479vz\") on node \"crc\" DevicePath \"\"" Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.981191 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr"] Oct 04 04:58:47 crc kubenswrapper[4574]: I1004 04:58:47.982091 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.002716 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr"] Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.048844 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-proxy-ca-bundles\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.048887 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33dc0782-0326-44af-8e9e-f6685801f424-client-ca\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.048940 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c1d417-b0af-400e-a477-c93aebc8c47b-serving-cert\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.048959 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33dc0782-0326-44af-8e9e-f6685801f424-serving-cert\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: 
\"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.048980 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cz4h\" (UniqueName: \"kubernetes.io/projected/33dc0782-0326-44af-8e9e-f6685801f424-kube-api-access-2cz4h\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.049000 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dc0782-0326-44af-8e9e-f6685801f424-config\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.049016 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-config\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.049033 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-client-ca\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.049048 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl6pp\" (UniqueName: \"kubernetes.io/projected/f9c1d417-b0af-400e-a477-c93aebc8c47b-kube-api-access-pl6pp\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150043 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c1d417-b0af-400e-a477-c93aebc8c47b-serving-cert\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150103 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33dc0782-0326-44af-8e9e-f6685801f424-serving-cert\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150127 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cz4h\" (UniqueName: \"kubernetes.io/projected/33dc0782-0326-44af-8e9e-f6685801f424-kube-api-access-2cz4h\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150153 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dc0782-0326-44af-8e9e-f6685801f424-config\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: 
\"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150175 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-config\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150194 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-client-ca\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150216 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl6pp\" (UniqueName: \"kubernetes.io/projected/f9c1d417-b0af-400e-a477-c93aebc8c47b-kube-api-access-pl6pp\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150310 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-proxy-ca-bundles\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.150331 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/33dc0782-0326-44af-8e9e-f6685801f424-client-ca\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.151335 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33dc0782-0326-44af-8e9e-f6685801f424-client-ca\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.151948 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-client-ca\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.152199 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dc0782-0326-44af-8e9e-f6685801f424-config\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.152689 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-config\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.153184 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9c1d417-b0af-400e-a477-c93aebc8c47b-proxy-ca-bundles\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.155303 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c1d417-b0af-400e-a477-c93aebc8c47b-serving-cert\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.155412 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33dc0782-0326-44af-8e9e-f6685801f424-serving-cert\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.168934 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl6pp\" (UniqueName: \"kubernetes.io/projected/f9c1d417-b0af-400e-a477-c93aebc8c47b-kube-api-access-pl6pp\") pod \"controller-manager-7b8d54f884-7g7k6\" (UID: \"f9c1d417-b0af-400e-a477-c93aebc8c47b\") " pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.169382 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cz4h\" (UniqueName: \"kubernetes.io/projected/33dc0782-0326-44af-8e9e-f6685801f424-kube-api-access-2cz4h\") pod \"route-controller-manager-56c786bfcf-vdknr\" (UID: \"33dc0782-0326-44af-8e9e-f6685801f424\") " 
pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.206324 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.299008 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.375855 4574 generic.go:334] "Generic (PLEG): container finished" podID="f3111436-b5d8-405e-ab14-2fb33bd107c0" containerID="93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f" exitCode=0 Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.375992 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.376571 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" event={"ID":"f3111436-b5d8-405e-ab14-2fb33bd107c0","Type":"ContainerDied","Data":"93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f"} Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.376635 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn" event={"ID":"f3111436-b5d8-405e-ab14-2fb33bd107c0","Type":"ContainerDied","Data":"5f7489fd399f5898efa5b782990148d54322afa00b2c0445269d4f581292d9b2"} Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.376655 4574 scope.go:117] "RemoveContainer" containerID="93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.386007 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" event={"ID":"34ee31e2-d15b-4055-9e27-2ce2e9e43c28","Type":"ContainerStarted","Data":"23cf4d9c615a7ab5a698cfe26892ce4cc39e648e51e1a8cb443c211e1ebbfe0b"} Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.388884 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" event={"ID":"f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf","Type":"ContainerDied","Data":"3d16730d0de202f53d5a81f72b044b0ce64d02b64a64d0740ff7d72084eddf00"} Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.388951 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k52jj" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.412513 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-cxhlp" podStartSLOduration=2.040752537 podStartE2EDuration="4.412490928s" podCreationTimestamp="2025-10-04 04:58:44 +0000 UTC" firstStartedPulling="2025-10-04 04:58:45.079488886 +0000 UTC m=+750.933631928" lastFinishedPulling="2025-10-04 04:58:47.451227277 +0000 UTC m=+753.305370319" observedRunningTime="2025-10-04 04:58:48.410874401 +0000 UTC m=+754.265017463" watchObservedRunningTime="2025-10-04 04:58:48.412490928 +0000 UTC m=+754.266633990" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.439370 4574 scope.go:117] "RemoveContainer" containerID="93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f" Oct 04 04:58:48 crc kubenswrapper[4574]: E1004 04:58:48.439897 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f\": container with ID starting with 93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f not found: ID does not exist" 
containerID="93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.439932 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f"} err="failed to get container status \"93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f\": rpc error: code = NotFound desc = could not find container \"93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f\": container with ID starting with 93b8ba0521a05f29d896e7bd1cef47a6e1d5f259349b99c1c60755826f5bb12f not found: ID does not exist" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.439961 4574 scope.go:117] "RemoveContainer" containerID="11f6a7b4a35c83287d7aed50c86cf756f8877b7c462f55a3935ed25c886217ac" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.456869 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b8d54f884-7g7k6"] Oct 04 04:58:48 crc kubenswrapper[4574]: W1004 04:58:48.466768 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c1d417_b0af_400e_a477_c93aebc8c47b.slice/crio-66c102b144a449e0047764bf0642cb38fbd88d1dfd39c48f2cf2fcea1aeacd03 WatchSource:0}: Error finding container 66c102b144a449e0047764bf0642cb38fbd88d1dfd39c48f2cf2fcea1aeacd03: Status 404 returned error can't find the container with id 66c102b144a449e0047764bf0642cb38fbd88d1dfd39c48f2cf2fcea1aeacd03 Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.482547 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k52jj"] Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.485788 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k52jj"] Oct 04 04:58:48 crc kubenswrapper[4574]: 
I1004 04:58:48.499357 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn"] Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.502825 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n25jn"] Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.601217 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr"] Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.749702 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3111436-b5d8-405e-ab14-2fb33bd107c0" path="/var/lib/kubelet/pods/f3111436-b5d8-405e-ab14-2fb33bd107c0/volumes" Oct 04 04:58:48 crc kubenswrapper[4574]: I1004 04:58:48.750850 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf" path="/var/lib/kubelet/pods/f48fd06d-0ae0-4d8d-9089-37cf4ac50eaf/volumes" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.396044 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" event={"ID":"33dc0782-0326-44af-8e9e-f6685801f424","Type":"ContainerStarted","Data":"5b6eb4ec50fd668341a2e1a357ab3f2fc1aa662df22cdfed147691a0dfc5d628"} Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.396085 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" event={"ID":"33dc0782-0326-44af-8e9e-f6685801f424","Type":"ContainerStarted","Data":"7760d152ebe9b5a2f393563c30a12885b040885eda89b739312e4e159aad0ca9"} Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.397179 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 
04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.399132 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" event={"ID":"f9c1d417-b0af-400e-a477-c93aebc8c47b","Type":"ContainerStarted","Data":"444248f28668dbe1dc7bc5c6479bb7722dbdfd701d501f532aec5c8be6e872d2"} Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.399332 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.399411 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" event={"ID":"f9c1d417-b0af-400e-a477-c93aebc8c47b","Type":"ContainerStarted","Data":"66c102b144a449e0047764bf0642cb38fbd88d1dfd39c48f2cf2fcea1aeacd03"} Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.404397 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.404459 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.406387 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.413814 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.463811 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56c786bfcf-vdknr" podStartSLOduration=2.463791557 podStartE2EDuration="2.463791557s" podCreationTimestamp="2025-10-04 04:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:58:49.461878741 +0000 UTC m=+755.316021783" watchObservedRunningTime="2025-10-04 04:58:49.463791557 +0000 UTC m=+755.317934599" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.549501 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b8d54f884-7g7k6" podStartSLOduration=2.549477928 podStartE2EDuration="2.549477928s" podCreationTimestamp="2025-10-04 04:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:58:49.539251092 +0000 UTC m=+755.393394144" watchObservedRunningTime="2025-10-04 04:58:49.549477928 +0000 UTC m=+755.403620970" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.663381 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q"] Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.664475 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.668042 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v"] Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.669532 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.671760 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.690611 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v"] Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.702471 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q"] Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.760489 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-d95j7"] Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.762218 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.771907 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w49x\" (UniqueName: \"kubernetes.io/projected/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-kube-api-access-9w49x\") pod \"nmstate-webhook-6cdbc54649-p9s5q\" (UID: \"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.772249 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-p9s5q\" (UID: \"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.772355 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wmjzx\" (UniqueName: \"kubernetes.io/projected/7131c3ab-9443-4308-acef-460450511901-kube-api-access-wmjzx\") pod \"nmstate-metrics-fdff9cb8d-6mt9v\" (UID: \"7131c3ab-9443-4308-acef-460450511901\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.873923 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwwr\" (UniqueName: \"kubernetes.io/projected/88957498-0f2f-4fb7-baca-fc52a6abec78-kube-api-access-jfwwr\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.874316 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-dbus-socket\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.874472 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-nmstate-lock\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.874601 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w49x\" (UniqueName: \"kubernetes.io/projected/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-kube-api-access-9w49x\") pod \"nmstate-webhook-6cdbc54649-p9s5q\" (UID: \"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.874676 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-p9s5q\" (UID: \"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.874779 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-ovs-socket\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.874887 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmjzx\" (UniqueName: \"kubernetes.io/projected/7131c3ab-9443-4308-acef-460450511901-kube-api-access-wmjzx\") pod \"nmstate-metrics-fdff9cb8d-6mt9v\" (UID: \"7131c3ab-9443-4308-acef-460450511901\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" Oct 04 04:58:49 crc kubenswrapper[4574]: E1004 04:58:49.875617 4574 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 04 04:58:49 crc kubenswrapper[4574]: E1004 04:58:49.876784 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-tls-key-pair podName:77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6 nodeName:}" failed. No retries permitted until 2025-10-04 04:58:50.376761918 +0000 UTC m=+756.230905190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-tls-key-pair") pod "nmstate-webhook-6cdbc54649-p9s5q" (UID: "77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6") : secret "openshift-nmstate-webhook" not found Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.909810 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w49x\" (UniqueName: \"kubernetes.io/projected/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-kube-api-access-9w49x\") pod \"nmstate-webhook-6cdbc54649-p9s5q\" (UID: \"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.911010 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89"] Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.912096 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.914378 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.915146 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmjzx\" (UniqueName: \"kubernetes.io/projected/7131c3ab-9443-4308-acef-460450511901-kube-api-access-wmjzx\") pod \"nmstate-metrics-fdff9cb8d-6mt9v\" (UID: \"7131c3ab-9443-4308-acef-460450511901\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.915583 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.915998 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9hmml" Oct 
04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.941621 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89"] Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.978220 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dc0445fe-9646-4248-a71b-c0dfff8b50f2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.978546 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc0445fe-9646-4248-a71b-c0dfff8b50f2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.978676 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwwr\" (UniqueName: \"kubernetes.io/projected/88957498-0f2f-4fb7-baca-fc52a6abec78-kube-api-access-jfwwr\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.978808 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8njz9\" (UniqueName: \"kubernetes.io/projected/dc0445fe-9646-4248-a71b-c0dfff8b50f2-kube-api-access-8njz9\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.978931 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-dbus-socket\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.979013 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-nmstate-lock\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.979121 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-ovs-socket\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.979286 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-ovs-socket\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.980280 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-dbus-socket\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:49 crc kubenswrapper[4574]: I1004 04:58:49.980401 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/88957498-0f2f-4fb7-baca-fc52a6abec78-nmstate-lock\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.001081 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.017142 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwwr\" (UniqueName: \"kubernetes.io/projected/88957498-0f2f-4fb7-baca-fc52a6abec78-kube-api-access-jfwwr\") pod \"nmstate-handler-d95j7\" (UID: \"88957498-0f2f-4fb7-baca-fc52a6abec78\") " pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.080970 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dc0445fe-9646-4248-a71b-c0dfff8b50f2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.081294 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc0445fe-9646-4248-a71b-c0dfff8b50f2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.081423 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8njz9\" (UniqueName: \"kubernetes.io/projected/dc0445fe-9646-4248-a71b-c0dfff8b50f2-kube-api-access-8njz9\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:50 crc kubenswrapper[4574]: E1004 04:58:50.081913 4574 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 04 04:58:50 crc kubenswrapper[4574]: E1004 04:58:50.082049 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc0445fe-9646-4248-a71b-c0dfff8b50f2-plugin-serving-cert podName:dc0445fe-9646-4248-a71b-c0dfff8b50f2 nodeName:}" failed. No retries permitted until 2025-10-04 04:58:50.582028303 +0000 UTC m=+756.436171345 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/dc0445fe-9646-4248-a71b-c0dfff8b50f2-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-vlq89" (UID: "dc0445fe-9646-4248-a71b-c0dfff8b50f2") : secret "plugin-serving-cert" not found Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.082348 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dc0445fe-9646-4248-a71b-c0dfff8b50f2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.088265 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.109894 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8njz9\" (UniqueName: \"kubernetes.io/projected/dc0445fe-9646-4248-a71b-c0dfff8b50f2-kube-api-access-8njz9\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.398047 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-p9s5q\" (UID: \"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.403671 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-p9s5q\" (UID: \"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.409842 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d95j7" event={"ID":"88957498-0f2f-4fb7-baca-fc52a6abec78","Type":"ContainerStarted","Data":"b0bdc5c633d7e0ce6aa2285b35cf09c79871b22d8ed4fe832cc6466d245ec51e"} Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.568875 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v"] Oct 04 04:58:50 crc kubenswrapper[4574]: W1004 04:58:50.577729 4574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7131c3ab_9443_4308_acef_460450511901.slice/crio-47da2e5e3a25bac584038d087aedb8d017bff8c1974c760a484b876a9667b91e WatchSource:0}: Error finding container 47da2e5e3a25bac584038d087aedb8d017bff8c1974c760a484b876a9667b91e: Status 404 returned error can't find the container with id 47da2e5e3a25bac584038d087aedb8d017bff8c1974c760a484b876a9667b91e Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.587318 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.601649 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc0445fe-9646-4248-a71b-c0dfff8b50f2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.606965 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc0445fe-9646-4248-a71b-c0dfff8b50f2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vlq89\" (UID: \"dc0445fe-9646-4248-a71b-c0dfff8b50f2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:50 crc kubenswrapper[4574]: I1004 04:58:50.874830 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.017998 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q"] Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.298549 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89"] Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.443993 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" event={"ID":"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6","Type":"ContainerStarted","Data":"656ea772aff7c62188d18980e02aa919044ab00c604d3babff82548325770d9a"} Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.450648 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fc7b6f686-nsfvw"] Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.451896 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" event={"ID":"7131c3ab-9443-4308-acef-460450511901","Type":"ContainerStarted","Data":"47da2e5e3a25bac584038d087aedb8d017bff8c1974c760a484b876a9667b91e"} Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.452315 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.459384 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" event={"ID":"dc0445fe-9646-4248-a71b-c0dfff8b50f2","Type":"ContainerStarted","Data":"b162d4f2bdc969e2264b6a4963ea7161bf316f0bc5075ebabdb6fc1ed96682f6"} Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.466902 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc7b6f686-nsfvw"] Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.533173 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-console-config\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.533264 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da49673f-54af-41d8-bdc2-8bb83b927510-console-serving-cert\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.533336 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-oauth-serving-cert\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.533366 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da49673f-54af-41d8-bdc2-8bb83b927510-console-oauth-config\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.533470 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk88j\" (UniqueName: \"kubernetes.io/projected/da49673f-54af-41d8-bdc2-8bb83b927510-kube-api-access-vk88j\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.533502 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-service-ca\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.533529 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-trusted-ca-bundle\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.634660 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-console-config\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.634742 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da49673f-54af-41d8-bdc2-8bb83b927510-console-serving-cert\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.634796 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-oauth-serving-cert\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.634831 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da49673f-54af-41d8-bdc2-8bb83b927510-console-oauth-config\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.634871 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk88j\" (UniqueName: \"kubernetes.io/projected/da49673f-54af-41d8-bdc2-8bb83b927510-kube-api-access-vk88j\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.634902 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-service-ca\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.634924 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-trusted-ca-bundle\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.636028 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-console-config\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.639549 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-service-ca\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.639582 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-trusted-ca-bundle\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.639607 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da49673f-54af-41d8-bdc2-8bb83b927510-oauth-serving-cert\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.653530 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/da49673f-54af-41d8-bdc2-8bb83b927510-console-oauth-config\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.658869 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da49673f-54af-41d8-bdc2-8bb83b927510-console-serving-cert\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.664157 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk88j\" (UniqueName: \"kubernetes.io/projected/da49673f-54af-41d8-bdc2-8bb83b927510-kube-api-access-vk88j\") pod \"console-5fc7b6f686-nsfvw\" (UID: \"da49673f-54af-41d8-bdc2-8bb83b927510\") " pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:51 crc kubenswrapper[4574]: I1004 04:58:51.780902 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:58:52 crc kubenswrapper[4574]: I1004 04:58:52.037109 4574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 04 04:58:52 crc kubenswrapper[4574]: I1004 04:58:52.296649 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc7b6f686-nsfvw"] Oct 04 04:58:52 crc kubenswrapper[4574]: I1004 04:58:52.468802 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc7b6f686-nsfvw" event={"ID":"da49673f-54af-41d8-bdc2-8bb83b927510","Type":"ContainerStarted","Data":"972d7b9a737ac77ade195d947b3f7e0c188601aed5f785a305c27fd5ef450925"} Oct 04 04:58:53 crc kubenswrapper[4574]: I1004 04:58:53.479001 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc7b6f686-nsfvw" event={"ID":"da49673f-54af-41d8-bdc2-8bb83b927510","Type":"ContainerStarted","Data":"dd8472cf365c5a501ad776adcf36fd6a19bd5a4f12840effb74c3bac51805b50"} Oct 04 04:58:53 crc kubenswrapper[4574]: I1004 04:58:53.504038 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fc7b6f686-nsfvw" podStartSLOduration=2.504003432 podStartE2EDuration="2.504003432s" podCreationTimestamp="2025-10-04 04:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 04:58:53.501989633 +0000 UTC m=+759.356132675" watchObservedRunningTime="2025-10-04 04:58:53.504003432 +0000 UTC m=+759.358146474" Oct 04 04:58:55 crc kubenswrapper[4574]: I1004 04:58:55.498514 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" event={"ID":"7131c3ab-9443-4308-acef-460450511901","Type":"ContainerStarted","Data":"78b7e605b4f1d290dd13aed4c6ddb6b2bb878c18c16ee26fc6e08dfcd6b83189"} Oct 04 04:58:55 crc 
kubenswrapper[4574]: I1004 04:58:55.499995 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" event={"ID":"dc0445fe-9646-4248-a71b-c0dfff8b50f2","Type":"ContainerStarted","Data":"3830a3ee5a64aa5ef5c9de244aa05aefac9d59e6132add6529ae2ee7908f5e25"} Oct 04 04:58:55 crc kubenswrapper[4574]: I1004 04:58:55.502926 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" event={"ID":"77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6","Type":"ContainerStarted","Data":"6a57a5313189860f4088ad8c9d5023df8fa5b700f5b3ebe5b77ebc262c27c40b"} Oct 04 04:58:55 crc kubenswrapper[4574]: I1004 04:58:55.503191 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:58:55 crc kubenswrapper[4574]: I1004 04:58:55.505287 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d95j7" event={"ID":"88957498-0f2f-4fb7-baca-fc52a6abec78","Type":"ContainerStarted","Data":"10edeb58b579907aa4b73dac0d679f9d6d05e8402676d508c7469e0c7995f4ca"} Oct 04 04:58:55 crc kubenswrapper[4574]: I1004 04:58:55.505782 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:58:55 crc kubenswrapper[4574]: I1004 04:58:55.536387 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vlq89" podStartSLOduration=3.169605403 podStartE2EDuration="6.536351583s" podCreationTimestamp="2025-10-04 04:58:49 +0000 UTC" firstStartedPulling="2025-10-04 04:58:51.314968241 +0000 UTC m=+757.169111273" lastFinishedPulling="2025-10-04 04:58:54.681714391 +0000 UTC m=+760.535857453" observedRunningTime="2025-10-04 04:58:55.521835713 +0000 UTC m=+761.375978755" watchObservedRunningTime="2025-10-04 04:58:55.536351583 +0000 UTC m=+761.390494625" Oct 04 04:58:55 crc 
kubenswrapper[4574]: I1004 04:58:55.569959 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-d95j7" podStartSLOduration=2.006786796 podStartE2EDuration="6.569941626s" podCreationTimestamp="2025-10-04 04:58:49 +0000 UTC" firstStartedPulling="2025-10-04 04:58:50.125276835 +0000 UTC m=+755.979419907" lastFinishedPulling="2025-10-04 04:58:54.688431695 +0000 UTC m=+760.542574737" observedRunningTime="2025-10-04 04:58:55.569782092 +0000 UTC m=+761.423925134" watchObservedRunningTime="2025-10-04 04:58:55.569941626 +0000 UTC m=+761.424084658" Oct 04 04:58:55 crc kubenswrapper[4574]: I1004 04:58:55.571627 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" podStartSLOduration=2.9136231 podStartE2EDuration="6.571619865s" podCreationTimestamp="2025-10-04 04:58:49 +0000 UTC" firstStartedPulling="2025-10-04 04:58:51.022394867 +0000 UTC m=+756.876537899" lastFinishedPulling="2025-10-04 04:58:54.680391622 +0000 UTC m=+760.534534664" observedRunningTime="2025-10-04 04:58:55.549195375 +0000 UTC m=+761.403338417" watchObservedRunningTime="2025-10-04 04:58:55.571619865 +0000 UTC m=+761.425762907" Oct 04 04:58:57 crc kubenswrapper[4574]: I1004 04:58:57.519848 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" event={"ID":"7131c3ab-9443-4308-acef-460450511901","Type":"ContainerStarted","Data":"57f3971360020f0c3922c8bc14ef05b5342d49933acbc7b0f18110ff7903c694"} Oct 04 04:58:57 crc kubenswrapper[4574]: I1004 04:58:57.546919 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6mt9v" podStartSLOduration=1.972323206 podStartE2EDuration="8.546890093s" podCreationTimestamp="2025-10-04 04:58:49 +0000 UTC" firstStartedPulling="2025-10-04 04:58:50.580444447 +0000 UTC m=+756.434587489" lastFinishedPulling="2025-10-04 
04:58:57.155011324 +0000 UTC m=+763.009154376" observedRunningTime="2025-10-04 04:58:57.541841067 +0000 UTC m=+763.395984149" watchObservedRunningTime="2025-10-04 04:58:57.546890093 +0000 UTC m=+763.401033135" Oct 04 04:59:00 crc kubenswrapper[4574]: I1004 04:59:00.116806 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-d95j7" Oct 04 04:59:01 crc kubenswrapper[4574]: I1004 04:59:01.781629 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:59:01 crc kubenswrapper[4574]: I1004 04:59:01.781687 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:59:01 crc kubenswrapper[4574]: I1004 04:59:01.787399 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:59:02 crc kubenswrapper[4574]: I1004 04:59:02.549564 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fc7b6f686-nsfvw" Oct 04 04:59:02 crc kubenswrapper[4574]: I1004 04:59:02.603150 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-l8x2m"] Oct 04 04:59:10 crc kubenswrapper[4574]: I1004 04:59:10.592439 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-p9s5q" Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.405002 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.405614 4574 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.405667 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.406312 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a0b072b2db63c5fef6028adb7e7cc7f770356e62fc2cc2752bf99549d02a71e"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.406368 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://6a0b072b2db63c5fef6028adb7e7cc7f770356e62fc2cc2752bf99549d02a71e" gracePeriod=600 Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.645633 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="6a0b072b2db63c5fef6028adb7e7cc7f770356e62fc2cc2752bf99549d02a71e" exitCode=0 Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.645918 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"6a0b072b2db63c5fef6028adb7e7cc7f770356e62fc2cc2752bf99549d02a71e"} Oct 04 04:59:19 crc kubenswrapper[4574]: I1004 04:59:19.646193 4574 scope.go:117] "RemoveContainer" 
containerID="bb670e871f07036b92f573b0b8c75028cb0c737ed5c76f73792c850b52dbde9a" Oct 04 04:59:20 crc kubenswrapper[4574]: I1004 04:59:20.655289 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"a99234efe67f037290baa95758d3a1f0d549bea91113058aaa5fd090767eb42e"} Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.313384 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6"] Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.315222 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.317499 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.326762 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6"] Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.497989 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.498069 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-util\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.498382 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmv86\" (UniqueName: \"kubernetes.io/projected/8493ffda-5976-4e28-9927-9bc66b26fccf-kube-api-access-kmv86\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.600158 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.600289 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.600448 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmv86\" (UniqueName: \"kubernetes.io/projected/8493ffda-5976-4e28-9927-9bc66b26fccf-kube-api-access-kmv86\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: 
\"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.600786 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.600884 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.619403 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmv86\" (UniqueName: \"kubernetes.io/projected/8493ffda-5976-4e28-9927-9bc66b26fccf-kube-api-access-kmv86\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:23 crc kubenswrapper[4574]: I1004 04:59:23.640646 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:24 crc kubenswrapper[4574]: I1004 04:59:24.056855 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6"] Oct 04 04:59:24 crc kubenswrapper[4574]: W1004 04:59:24.066580 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8493ffda_5976_4e28_9927_9bc66b26fccf.slice/crio-38e45ed7ff84b75f3449524c9354189bdfeb06a98edbe31f3b80566d1216bd13 WatchSource:0}: Error finding container 38e45ed7ff84b75f3449524c9354189bdfeb06a98edbe31f3b80566d1216bd13: Status 404 returned error can't find the container with id 38e45ed7ff84b75f3449524c9354189bdfeb06a98edbe31f3b80566d1216bd13 Oct 04 04:59:24 crc kubenswrapper[4574]: I1004 04:59:24.686224 4574 generic.go:334] "Generic (PLEG): container finished" podID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerID="7146fe945bf1e6b84351d5f50d57d7c3fbe71a2f1ccd865b36908466aa8ebed2" exitCode=0 Oct 04 04:59:24 crc kubenswrapper[4574]: I1004 04:59:24.686654 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" event={"ID":"8493ffda-5976-4e28-9927-9bc66b26fccf","Type":"ContainerDied","Data":"7146fe945bf1e6b84351d5f50d57d7c3fbe71a2f1ccd865b36908466aa8ebed2"} Oct 04 04:59:24 crc kubenswrapper[4574]: I1004 04:59:24.686740 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" event={"ID":"8493ffda-5976-4e28-9927-9bc66b26fccf","Type":"ContainerStarted","Data":"38e45ed7ff84b75f3449524c9354189bdfeb06a98edbe31f3b80566d1216bd13"} Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.618429 4574 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-h7xk4"] Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.620046 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.632529 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7xk4"] Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.734373 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-utilities\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.734479 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrs4\" (UniqueName: \"kubernetes.io/projected/316fb3f3-ea57-453f-990a-5b47c87a6f6b-kube-api-access-nzrs4\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.734513 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-catalog-content\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.835761 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrs4\" (UniqueName: \"kubernetes.io/projected/316fb3f3-ea57-453f-990a-5b47c87a6f6b-kube-api-access-nzrs4\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " 
pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.835828 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-catalog-content\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.835958 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-utilities\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.836432 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-utilities\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.836548 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-catalog-content\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc kubenswrapper[4574]: I1004 04:59:25.870801 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrs4\" (UniqueName: \"kubernetes.io/projected/316fb3f3-ea57-453f-990a-5b47c87a6f6b-kube-api-access-nzrs4\") pod \"redhat-operators-h7xk4\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:25 crc 
kubenswrapper[4574]: I1004 04:59:25.938726 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:26 crc kubenswrapper[4574]: I1004 04:59:26.403829 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7xk4"] Oct 04 04:59:26 crc kubenswrapper[4574]: W1004 04:59:26.413847 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316fb3f3_ea57_453f_990a_5b47c87a6f6b.slice/crio-9a1a34b560926a601d953819efa89e4e63c64b7f194b20dad2e9ba19a82353ca WatchSource:0}: Error finding container 9a1a34b560926a601d953819efa89e4e63c64b7f194b20dad2e9ba19a82353ca: Status 404 returned error can't find the container with id 9a1a34b560926a601d953819efa89e4e63c64b7f194b20dad2e9ba19a82353ca Oct 04 04:59:26 crc kubenswrapper[4574]: I1004 04:59:26.704088 4574 generic.go:334] "Generic (PLEG): container finished" podID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerID="11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2" exitCode=0 Oct 04 04:59:26 crc kubenswrapper[4574]: I1004 04:59:26.704162 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xk4" event={"ID":"316fb3f3-ea57-453f-990a-5b47c87a6f6b","Type":"ContainerDied","Data":"11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2"} Oct 04 04:59:26 crc kubenswrapper[4574]: I1004 04:59:26.704254 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xk4" event={"ID":"316fb3f3-ea57-453f-990a-5b47c87a6f6b","Type":"ContainerStarted","Data":"9a1a34b560926a601d953819efa89e4e63c64b7f194b20dad2e9ba19a82353ca"} Oct 04 04:59:26 crc kubenswrapper[4574]: I1004 04:59:26.707029 4574 generic.go:334] "Generic (PLEG): container finished" podID="8493ffda-5976-4e28-9927-9bc66b26fccf" 
containerID="2d2bc9951dded0fb77f4ffb0221f6d6a6378c1d1cf4ca5ec72e8a2240d6f0b57" exitCode=0 Oct 04 04:59:26 crc kubenswrapper[4574]: I1004 04:59:26.707077 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" event={"ID":"8493ffda-5976-4e28-9927-9bc66b26fccf","Type":"ContainerDied","Data":"2d2bc9951dded0fb77f4ffb0221f6d6a6378c1d1cf4ca5ec72e8a2240d6f0b57"} Oct 04 04:59:27 crc kubenswrapper[4574]: I1004 04:59:27.650001 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-l8x2m" podUID="87ef4dec-e273-41a2-96de-6c9cc05122d2" containerName="console" containerID="cri-o://6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532" gracePeriod=15 Oct 04 04:59:27 crc kubenswrapper[4574]: I1004 04:59:27.715270 4574 generic.go:334] "Generic (PLEG): container finished" podID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerID="101d6ff19eda7ac6002a7ca343dc89b2780a91b36e71e637f767ebae5b60cd22" exitCode=0 Oct 04 04:59:27 crc kubenswrapper[4574]: I1004 04:59:27.715277 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" event={"ID":"8493ffda-5976-4e28-9927-9bc66b26fccf","Type":"ContainerDied","Data":"101d6ff19eda7ac6002a7ca343dc89b2780a91b36e71e637f767ebae5b60cd22"} Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.094746 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-l8x2m_87ef4dec-e273-41a2-96de-6c9cc05122d2/console/0.log" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.095553 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.270825 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-config\") pod \"87ef4dec-e273-41a2-96de-6c9cc05122d2\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.270889 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxrtq\" (UniqueName: \"kubernetes.io/projected/87ef4dec-e273-41a2-96de-6c9cc05122d2-kube-api-access-nxrtq\") pod \"87ef4dec-e273-41a2-96de-6c9cc05122d2\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.270958 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-oauth-config\") pod \"87ef4dec-e273-41a2-96de-6c9cc05122d2\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.271066 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-trusted-ca-bundle\") pod \"87ef4dec-e273-41a2-96de-6c9cc05122d2\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.271099 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-serving-cert\") pod \"87ef4dec-e273-41a2-96de-6c9cc05122d2\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.271151 4574 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-oauth-serving-cert\") pod \"87ef4dec-e273-41a2-96de-6c9cc05122d2\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.271195 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-service-ca\") pod \"87ef4dec-e273-41a2-96de-6c9cc05122d2\" (UID: \"87ef4dec-e273-41a2-96de-6c9cc05122d2\") " Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.272031 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "87ef4dec-e273-41a2-96de-6c9cc05122d2" (UID: "87ef4dec-e273-41a2-96de-6c9cc05122d2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.272101 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-service-ca" (OuterVolumeSpecName: "service-ca") pod "87ef4dec-e273-41a2-96de-6c9cc05122d2" (UID: "87ef4dec-e273-41a2-96de-6c9cc05122d2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.272125 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-config" (OuterVolumeSpecName: "console-config") pod "87ef4dec-e273-41a2-96de-6c9cc05122d2" (UID: "87ef4dec-e273-41a2-96de-6c9cc05122d2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.272142 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "87ef4dec-e273-41a2-96de-6c9cc05122d2" (UID: "87ef4dec-e273-41a2-96de-6c9cc05122d2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.279508 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "87ef4dec-e273-41a2-96de-6c9cc05122d2" (UID: "87ef4dec-e273-41a2-96de-6c9cc05122d2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.286731 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "87ef4dec-e273-41a2-96de-6c9cc05122d2" (UID: "87ef4dec-e273-41a2-96de-6c9cc05122d2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.287013 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ef4dec-e273-41a2-96de-6c9cc05122d2-kube-api-access-nxrtq" (OuterVolumeSpecName: "kube-api-access-nxrtq") pod "87ef4dec-e273-41a2-96de-6c9cc05122d2" (UID: "87ef4dec-e273-41a2-96de-6c9cc05122d2"). InnerVolumeSpecName "kube-api-access-nxrtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.373259 4574 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-service-ca\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.373303 4574 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.373317 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxrtq\" (UniqueName: \"kubernetes.io/projected/87ef4dec-e273-41a2-96de-6c9cc05122d2-kube-api-access-nxrtq\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.373328 4574 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.373341 4574 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.373351 4574 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/87ef4dec-e273-41a2-96de-6c9cc05122d2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.373362 4574 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/87ef4dec-e273-41a2-96de-6c9cc05122d2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:28 crc 
kubenswrapper[4574]: I1004 04:59:28.721750 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-l8x2m_87ef4dec-e273-41a2-96de-6c9cc05122d2/console/0.log" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.722072 4574 generic.go:334] "Generic (PLEG): container finished" podID="87ef4dec-e273-41a2-96de-6c9cc05122d2" containerID="6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532" exitCode=2 Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.722181 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-l8x2m" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.722384 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l8x2m" event={"ID":"87ef4dec-e273-41a2-96de-6c9cc05122d2","Type":"ContainerDied","Data":"6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532"} Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.722444 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l8x2m" event={"ID":"87ef4dec-e273-41a2-96de-6c9cc05122d2","Type":"ContainerDied","Data":"e30879a049cbb99ef99f028c91c30a0c2c1eb4ec8df3abbb1744e61eb8623dca"} Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.722467 4574 scope.go:117] "RemoveContainer" containerID="6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.724671 4574 generic.go:334] "Generic (PLEG): container finished" podID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerID="1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54" exitCode=0 Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.725984 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xk4" 
event={"ID":"316fb3f3-ea57-453f-990a-5b47c87a6f6b","Type":"ContainerDied","Data":"1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54"} Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.740599 4574 scope.go:117] "RemoveContainer" containerID="6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532" Oct 04 04:59:28 crc kubenswrapper[4574]: E1004 04:59:28.742817 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532\": container with ID starting with 6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532 not found: ID does not exist" containerID="6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.742862 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532"} err="failed to get container status \"6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532\": rpc error: code = NotFound desc = could not find container \"6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532\": container with ID starting with 6ad6611bdd980e7e3686395a0a68afe6a5cf68dafc6826f5e39fa9c57f72f532 not found: ID does not exist" Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.772304 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-l8x2m"] Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.776801 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-l8x2m"] Oct 04 04:59:28 crc kubenswrapper[4574]: I1004 04:59:28.949614 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.083065 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmv86\" (UniqueName: \"kubernetes.io/projected/8493ffda-5976-4e28-9927-9bc66b26fccf-kube-api-access-kmv86\") pod \"8493ffda-5976-4e28-9927-9bc66b26fccf\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.083122 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-util\") pod \"8493ffda-5976-4e28-9927-9bc66b26fccf\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.083200 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-bundle\") pod \"8493ffda-5976-4e28-9927-9bc66b26fccf\" (UID: \"8493ffda-5976-4e28-9927-9bc66b26fccf\") " Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.084432 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-bundle" (OuterVolumeSpecName: "bundle") pod "8493ffda-5976-4e28-9927-9bc66b26fccf" (UID: "8493ffda-5976-4e28-9927-9bc66b26fccf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.088830 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8493ffda-5976-4e28-9927-9bc66b26fccf-kube-api-access-kmv86" (OuterVolumeSpecName: "kube-api-access-kmv86") pod "8493ffda-5976-4e28-9927-9bc66b26fccf" (UID: "8493ffda-5976-4e28-9927-9bc66b26fccf"). InnerVolumeSpecName "kube-api-access-kmv86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.100578 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-util" (OuterVolumeSpecName: "util") pod "8493ffda-5976-4e28-9927-9bc66b26fccf" (UID: "8493ffda-5976-4e28-9927-9bc66b26fccf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.184701 4574 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.184747 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmv86\" (UniqueName: \"kubernetes.io/projected/8493ffda-5976-4e28-9927-9bc66b26fccf-kube-api-access-kmv86\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.184760 4574 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8493ffda-5976-4e28-9927-9bc66b26fccf-util\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.733500 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" event={"ID":"8493ffda-5976-4e28-9927-9bc66b26fccf","Type":"ContainerDied","Data":"38e45ed7ff84b75f3449524c9354189bdfeb06a98edbe31f3b80566d1216bd13"} Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.733552 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38e45ed7ff84b75f3449524c9354189bdfeb06a98edbe31f3b80566d1216bd13" Oct 04 04:59:29 crc kubenswrapper[4574]: I1004 04:59:29.734654 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6" Oct 04 04:59:30 crc kubenswrapper[4574]: I1004 04:59:30.741195 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ef4dec-e273-41a2-96de-6c9cc05122d2" path="/var/lib/kubelet/pods/87ef4dec-e273-41a2-96de-6c9cc05122d2/volumes" Oct 04 04:59:30 crc kubenswrapper[4574]: I1004 04:59:30.743452 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xk4" event={"ID":"316fb3f3-ea57-453f-990a-5b47c87a6f6b","Type":"ContainerStarted","Data":"c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877"} Oct 04 04:59:30 crc kubenswrapper[4574]: I1004 04:59:30.783983 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h7xk4" podStartSLOduration=2.433175791 podStartE2EDuration="5.783959686s" podCreationTimestamp="2025-10-04 04:59:25 +0000 UTC" firstStartedPulling="2025-10-04 04:59:26.705738702 +0000 UTC m=+792.559881754" lastFinishedPulling="2025-10-04 04:59:30.056522607 +0000 UTC m=+795.910665649" observedRunningTime="2025-10-04 04:59:30.780465564 +0000 UTC m=+796.634608616" watchObservedRunningTime="2025-10-04 04:59:30.783959686 +0000 UTC m=+796.638102748" Oct 04 04:59:35 crc kubenswrapper[4574]: I1004 04:59:35.939575 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:35 crc kubenswrapper[4574]: I1004 04:59:35.939931 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:35 crc kubenswrapper[4574]: I1004 04:59:35.982976 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:36 crc kubenswrapper[4574]: I1004 04:59:36.823144 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:37 crc kubenswrapper[4574]: I1004 04:59:37.816629 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7xk4"] Oct 04 04:59:38 crc kubenswrapper[4574]: I1004 04:59:38.790467 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h7xk4" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerName="registry-server" containerID="cri-o://c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877" gracePeriod=2 Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.807069 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xk4" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.808721 4574 generic.go:334] "Generic (PLEG): container finished" podID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerID="c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877" exitCode=0 Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.808797 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xk4" event={"ID":"316fb3f3-ea57-453f-990a-5b47c87a6f6b","Type":"ContainerDied","Data":"c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877"} Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.809003 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xk4" event={"ID":"316fb3f3-ea57-453f-990a-5b47c87a6f6b","Type":"ContainerDied","Data":"9a1a34b560926a601d953819efa89e4e63c64b7f194b20dad2e9ba19a82353ca"} Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.809027 4574 scope.go:117] "RemoveContainer" containerID="c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.832617 4574 scope.go:117] "RemoveContainer" 
containerID="1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.851595 4574 scope.go:117] "RemoveContainer" containerID="11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.878920 4574 scope.go:117] "RemoveContainer" containerID="c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877" Oct 04 04:59:39 crc kubenswrapper[4574]: E1004 04:59:39.879479 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877\": container with ID starting with c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877 not found: ID does not exist" containerID="c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.879512 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877"} err="failed to get container status \"c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877\": rpc error: code = NotFound desc = could not find container \"c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877\": container with ID starting with c7e7ef9e68a11bd33f1231886371b848d9a20844b39cf36262e1cdd90b35d877 not found: ID does not exist" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.879540 4574 scope.go:117] "RemoveContainer" containerID="1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54" Oct 04 04:59:39 crc kubenswrapper[4574]: E1004 04:59:39.879734 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54\": container with ID starting with 
1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54 not found: ID does not exist" containerID="1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.879757 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54"} err="failed to get container status \"1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54\": rpc error: code = NotFound desc = could not find container \"1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54\": container with ID starting with 1b1767fd8e52dabe5cfc3f17369fd0c09b70bcbaa79ae428035ea8a16fb55b54 not found: ID does not exist" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.879771 4574 scope.go:117] "RemoveContainer" containerID="11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2" Oct 04 04:59:39 crc kubenswrapper[4574]: E1004 04:59:39.879960 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2\": container with ID starting with 11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2 not found: ID does not exist" containerID="11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.879981 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2"} err="failed to get container status \"11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2\": rpc error: code = NotFound desc = could not find container \"11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2\": container with ID starting with 11cc6dc05f1f80f8f4615aedbf0be945b2f6472fba5a064aa46527acdaf3dab2 not found: ID does not 
exist" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.928578 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-utilities\") pod \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.928667 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzrs4\" (UniqueName: \"kubernetes.io/projected/316fb3f3-ea57-453f-990a-5b47c87a6f6b-kube-api-access-nzrs4\") pod \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.928832 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-catalog-content\") pod \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\" (UID: \"316fb3f3-ea57-453f-990a-5b47c87a6f6b\") " Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.930498 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-utilities" (OuterVolumeSpecName: "utilities") pod "316fb3f3-ea57-453f-990a-5b47c87a6f6b" (UID: "316fb3f3-ea57-453f-990a-5b47c87a6f6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:59:39 crc kubenswrapper[4574]: I1004 04:59:39.937397 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316fb3f3-ea57-453f-990a-5b47c87a6f6b-kube-api-access-nzrs4" (OuterVolumeSpecName: "kube-api-access-nzrs4") pod "316fb3f3-ea57-453f-990a-5b47c87a6f6b" (UID: "316fb3f3-ea57-453f-990a-5b47c87a6f6b"). InnerVolumeSpecName "kube-api-access-nzrs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.014634 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "316fb3f3-ea57-453f-990a-5b47c87a6f6b" (UID: "316fb3f3-ea57-453f-990a-5b47c87a6f6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.030192 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.030257 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316fb3f3-ea57-453f-990a-5b47c87a6f6b-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.030276 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzrs4\" (UniqueName: \"kubernetes.io/projected/316fb3f3-ea57-453f-990a-5b47c87a6f6b-kube-api-access-nzrs4\") on node \"crc\" DevicePath \"\"" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299080 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm"] Oct 04 04:59:40 crc kubenswrapper[4574]: E1004 04:59:40.299369 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerName="util" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299386 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerName="util" Oct 04 04:59:40 crc kubenswrapper[4574]: E1004 04:59:40.299399 4574 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="87ef4dec-e273-41a2-96de-6c9cc05122d2" containerName="console" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299408 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ef4dec-e273-41a2-96de-6c9cc05122d2" containerName="console" Oct 04 04:59:40 crc kubenswrapper[4574]: E1004 04:59:40.299423 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerName="extract" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299430 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerName="extract" Oct 04 04:59:40 crc kubenswrapper[4574]: E1004 04:59:40.299442 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerName="pull" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299447 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerName="pull" Oct 04 04:59:40 crc kubenswrapper[4574]: E1004 04:59:40.299456 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerName="extract-content" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299463 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerName="extract-content" Oct 04 04:59:40 crc kubenswrapper[4574]: E1004 04:59:40.299471 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerName="registry-server" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299477 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerName="registry-server" Oct 04 04:59:40 crc kubenswrapper[4574]: E1004 04:59:40.299484 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" 
containerName="extract-utilities" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299490 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerName="extract-utilities" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299593 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" containerName="registry-server" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299608 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="8493ffda-5976-4e28-9927-9bc66b26fccf" containerName="extract" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.299615 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ef4dec-e273-41a2-96de-6c9cc05122d2" containerName="console" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.300093 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.309769 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.310504 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.310613 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.310635 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.312462 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-94426" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 
04:59:40.334720 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnc4\" (UniqueName: \"kubernetes.io/projected/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-kube-api-access-hpnc4\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.334790 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-apiservice-cert\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.334870 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-webhook-cert\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.337969 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm"] Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.436551 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-apiservice-cert\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc 
kubenswrapper[4574]: I1004 04:59:40.436661 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-webhook-cert\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.436938 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnc4\" (UniqueName: \"kubernetes.io/projected/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-kube-api-access-hpnc4\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.445485 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-apiservice-cert\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.462914 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-webhook-cert\") pod \"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.472267 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnc4\" (UniqueName: \"kubernetes.io/projected/cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3-kube-api-access-hpnc4\") pod 
\"metallb-operator-controller-manager-7956f7d5bc-68jqm\" (UID: \"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3\") " pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.604953 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"] Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.605943 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.612556 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.612667 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.616592 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.623096 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4l6ld"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.640007 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b23a9bd-b984-4ec1-b18a-9617dad3a194-apiservice-cert\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.640137 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b23a9bd-b984-4ec1-b18a-9617dad3a194-webhook-cert\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.640178 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw96r\" (UniqueName: \"kubernetes.io/projected/0b23a9bd-b984-4ec1-b18a-9617dad3a194-kube-api-access-bw96r\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.680159 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"]
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.741203 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b23a9bd-b984-4ec1-b18a-9617dad3a194-apiservice-cert\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.741295 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b23a9bd-b984-4ec1-b18a-9617dad3a194-webhook-cert\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.741329 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw96r\" (UniqueName: \"kubernetes.io/projected/0b23a9bd-b984-4ec1-b18a-9617dad3a194-kube-api-access-bw96r\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.750567 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b23a9bd-b984-4ec1-b18a-9617dad3a194-apiservice-cert\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.750881 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b23a9bd-b984-4ec1-b18a-9617dad3a194-webhook-cert\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.767976 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw96r\" (UniqueName: \"kubernetes.io/projected/0b23a9bd-b984-4ec1-b18a-9617dad3a194-kube-api-access-bw96r\") pod \"metallb-operator-webhook-server-78dd4884c9-9rbjh\" (UID: \"0b23a9bd-b984-4ec1-b18a-9617dad3a194\") " pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.815053 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xk4"
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.851030 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7xk4"]
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.862812 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h7xk4"]
Oct 04 04:59:40 crc kubenswrapper[4574]: I1004 04:59:40.938162 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:41 crc kubenswrapper[4574]: I1004 04:59:41.250398 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm"]
Oct 04 04:59:41 crc kubenswrapper[4574]: I1004 04:59:41.371393 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"]
Oct 04 04:59:41 crc kubenswrapper[4574]: I1004 04:59:41.825865 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" event={"ID":"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3","Type":"ContainerStarted","Data":"4a8ba7d0c85972b2ae44c3698bf478b1ea06a5729c67f72aa8f89a32aebb6c46"}
Oct 04 04:59:41 crc kubenswrapper[4574]: I1004 04:59:41.827255 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh" event={"ID":"0b23a9bd-b984-4ec1-b18a-9617dad3a194","Type":"ContainerStarted","Data":"c7611fc0295a47c6fb0d224575e89a06997908d8c6b51c252bc42f86122400a8"}
Oct 04 04:59:42 crc kubenswrapper[4574]: I1004 04:59:42.747209 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316fb3f3-ea57-453f-990a-5b47c87a6f6b" path="/var/lib/kubelet/pods/316fb3f3-ea57-453f-990a-5b47c87a6f6b/volumes"
Oct 04 04:59:48 crc kubenswrapper[4574]: I1004 04:59:48.882656 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh" event={"ID":"0b23a9bd-b984-4ec1-b18a-9617dad3a194","Type":"ContainerStarted","Data":"d011f767385f50a9539f07ef1b16481d0d372bc376f9f68eed399e50097518a7"}
Oct 04 04:59:48 crc kubenswrapper[4574]: I1004 04:59:48.883197 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 04:59:48 crc kubenswrapper[4574]: I1004 04:59:48.885607 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" event={"ID":"cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3","Type":"ContainerStarted","Data":"76e1b275319e7fa14c1d99a1183d37fbd56340cc33ef561631da89608bfe8ec9"}
Oct 04 04:59:48 crc kubenswrapper[4574]: I1004 04:59:48.885765 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm"
Oct 04 04:59:48 crc kubenswrapper[4574]: I1004 04:59:48.902452 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh" podStartSLOduration=2.414458333 podStartE2EDuration="8.902436574s" podCreationTimestamp="2025-10-04 04:59:40 +0000 UTC" firstStartedPulling="2025-10-04 04:59:41.381617495 +0000 UTC m=+807.235760537" lastFinishedPulling="2025-10-04 04:59:47.869595736 +0000 UTC m=+813.723738778" observedRunningTime="2025-10-04 04:59:48.901051474 +0000 UTC m=+814.755194516" watchObservedRunningTime="2025-10-04 04:59:48.902436574 +0000 UTC m=+814.756579616"
Oct 04 04:59:48 crc kubenswrapper[4574]: I1004 04:59:48.925992 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" podStartSLOduration=2.309627315 podStartE2EDuration="8.925975738s" podCreationTimestamp="2025-10-04 04:59:40 +0000 UTC" firstStartedPulling="2025-10-04 04:59:41.238171165 +0000 UTC m=+807.092314207" lastFinishedPulling="2025-10-04 04:59:47.854519588 +0000 UTC m=+813.708662630" observedRunningTime="2025-10-04 04:59:48.920506779 +0000 UTC m=+814.774649821" watchObservedRunningTime="2025-10-04 04:59:48.925975738 +0000 UTC m=+814.780118780"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.136191 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"]
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.137185 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.143546 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.143603 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.159769 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"]
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.295862 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk6v4\" (UniqueName: \"kubernetes.io/projected/8306ee34-f88b-417d-b2e1-efa57667fdfd-kube-api-access-xk6v4\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.295942 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8306ee34-f88b-417d-b2e1-efa57667fdfd-secret-volume\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.295986 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8306ee34-f88b-417d-b2e1-efa57667fdfd-config-volume\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.397580 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8306ee34-f88b-417d-b2e1-efa57667fdfd-secret-volume\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.397636 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8306ee34-f88b-417d-b2e1-efa57667fdfd-config-volume\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.397718 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk6v4\" (UniqueName: \"kubernetes.io/projected/8306ee34-f88b-417d-b2e1-efa57667fdfd-kube-api-access-xk6v4\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.398910 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8306ee34-f88b-417d-b2e1-efa57667fdfd-config-volume\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.414406 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8306ee34-f88b-417d-b2e1-efa57667fdfd-secret-volume\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.414471 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk6v4\" (UniqueName: \"kubernetes.io/projected/8306ee34-f88b-417d-b2e1-efa57667fdfd-kube-api-access-xk6v4\") pod \"collect-profiles-29325900-vtgft\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.463374 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.902173 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"]
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.944626 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78dd4884c9-9rbjh"
Oct 04 05:00:00 crc kubenswrapper[4574]: I1004 05:00:00.953106 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft" event={"ID":"8306ee34-f88b-417d-b2e1-efa57667fdfd","Type":"ContainerStarted","Data":"c5b3bdc7b8f966d96a8701cb84398128c874f3f2f43aad880d976abf673634ae"}
Oct 04 05:00:01 crc kubenswrapper[4574]: I1004 05:00:01.961366 4574 generic.go:334] "Generic (PLEG): container finished" podID="8306ee34-f88b-417d-b2e1-efa57667fdfd" containerID="65dc39243397b106ef5cbfffa430b62a1dd2932c69495bdd24f846963625d012" exitCode=0
Oct 04 05:00:01 crc kubenswrapper[4574]: I1004 05:00:01.961437 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft" event={"ID":"8306ee34-f88b-417d-b2e1-efa57667fdfd","Type":"ContainerDied","Data":"65dc39243397b106ef5cbfffa430b62a1dd2932c69495bdd24f846963625d012"}
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.403736 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jts65"]
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.404813 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.428621 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jts65"]
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.524425 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-catalog-content\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.524523 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvx94\" (UniqueName: \"kubernetes.io/projected/7fec6701-f179-492a-9f1f-9aa2647a0b43-kube-api-access-rvx94\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.524841 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-utilities\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.627956 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvx94\" (UniqueName: \"kubernetes.io/projected/7fec6701-f179-492a-9f1f-9aa2647a0b43-kube-api-access-rvx94\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.628075 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-utilities\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.628115 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-catalog-content\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.628780 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-catalog-content\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.628782 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-utilities\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.662934 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvx94\" (UniqueName: \"kubernetes.io/projected/7fec6701-f179-492a-9f1f-9aa2647a0b43-kube-api-access-rvx94\") pod \"certified-operators-jts65\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") " pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:02 crc kubenswrapper[4574]: I1004 05:00:02.723458 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.259454 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jts65"]
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.386155 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.547682 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk6v4\" (UniqueName: \"kubernetes.io/projected/8306ee34-f88b-417d-b2e1-efa57667fdfd-kube-api-access-xk6v4\") pod \"8306ee34-f88b-417d-b2e1-efa57667fdfd\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") "
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.548085 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8306ee34-f88b-417d-b2e1-efa57667fdfd-secret-volume\") pod \"8306ee34-f88b-417d-b2e1-efa57667fdfd\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") "
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.548164 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8306ee34-f88b-417d-b2e1-efa57667fdfd-config-volume\") pod \"8306ee34-f88b-417d-b2e1-efa57667fdfd\" (UID: \"8306ee34-f88b-417d-b2e1-efa57667fdfd\") "
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.548743 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8306ee34-f88b-417d-b2e1-efa57667fdfd-config-volume" (OuterVolumeSpecName: "config-volume") pod "8306ee34-f88b-417d-b2e1-efa57667fdfd" (UID: "8306ee34-f88b-417d-b2e1-efa57667fdfd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.557440 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8306ee34-f88b-417d-b2e1-efa57667fdfd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8306ee34-f88b-417d-b2e1-efa57667fdfd" (UID: "8306ee34-f88b-417d-b2e1-efa57667fdfd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.557464 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8306ee34-f88b-417d-b2e1-efa57667fdfd-kube-api-access-xk6v4" (OuterVolumeSpecName: "kube-api-access-xk6v4") pod "8306ee34-f88b-417d-b2e1-efa57667fdfd" (UID: "8306ee34-f88b-417d-b2e1-efa57667fdfd"). InnerVolumeSpecName "kube-api-access-xk6v4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.650178 4574 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8306ee34-f88b-417d-b2e1-efa57667fdfd-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.650473 4574 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8306ee34-f88b-417d-b2e1-efa57667fdfd-config-volume\") on node \"crc\" DevicePath \"\""
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.650491 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk6v4\" (UniqueName: \"kubernetes.io/projected/8306ee34-f88b-417d-b2e1-efa57667fdfd-kube-api-access-xk6v4\") on node \"crc\" DevicePath \"\""
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.974050 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft" event={"ID":"8306ee34-f88b-417d-b2e1-efa57667fdfd","Type":"ContainerDied","Data":"c5b3bdc7b8f966d96a8701cb84398128c874f3f2f43aad880d976abf673634ae"}
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.974153 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5b3bdc7b8f966d96a8701cb84398128c874f3f2f43aad880d976abf673634ae"
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.974087 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.976037 4574 generic.go:334] "Generic (PLEG): container finished" podID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerID="efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9" exitCode=0
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.976166 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jts65" event={"ID":"7fec6701-f179-492a-9f1f-9aa2647a0b43","Type":"ContainerDied","Data":"efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9"}
Oct 04 05:00:03 crc kubenswrapper[4574]: I1004 05:00:03.976268 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jts65" event={"ID":"7fec6701-f179-492a-9f1f-9aa2647a0b43","Type":"ContainerStarted","Data":"b29a1390fa7ea98c9a829f9cfe016e1704aaf510e0c7058a19480383da5df782"}
Oct 04 05:00:05 crc kubenswrapper[4574]: I1004 05:00:05.988669 4574 generic.go:334] "Generic (PLEG): container finished" podID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerID="8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050" exitCode=0
Oct 04 05:00:05 crc kubenswrapper[4574]: I1004 05:00:05.988771 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jts65" event={"ID":"7fec6701-f179-492a-9f1f-9aa2647a0b43","Type":"ContainerDied","Data":"8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050"}
Oct 04 05:00:06 crc kubenswrapper[4574]: I1004 05:00:06.999505 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jts65" event={"ID":"7fec6701-f179-492a-9f1f-9aa2647a0b43","Type":"ContainerStarted","Data":"1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75"}
Oct 04 05:00:07 crc kubenswrapper[4574]: I1004 05:00:07.028148 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jts65" podStartSLOduration=2.307734462 podStartE2EDuration="5.02812353s" podCreationTimestamp="2025-10-04 05:00:02 +0000 UTC" firstStartedPulling="2025-10-04 05:00:03.977505751 +0000 UTC m=+829.831648793" lastFinishedPulling="2025-10-04 05:00:06.697894819 +0000 UTC m=+832.552037861" observedRunningTime="2025-10-04 05:00:07.024753652 +0000 UTC m=+832.878896704" watchObservedRunningTime="2025-10-04 05:00:07.02812353 +0000 UTC m=+832.882266572"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.201487 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7kplc"]
Oct 04 05:00:10 crc kubenswrapper[4574]: E1004 05:00:10.202119 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8306ee34-f88b-417d-b2e1-efa57667fdfd" containerName="collect-profiles"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.202140 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8306ee34-f88b-417d-b2e1-efa57667fdfd" containerName="collect-profiles"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.202293 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="8306ee34-f88b-417d-b2e1-efa57667fdfd" containerName="collect-profiles"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.203160 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.225210 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7kplc"]
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.345406 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-catalog-content\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.345511 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-utilities\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.345569 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pflms\" (UniqueName: \"kubernetes.io/projected/6659cf08-46c3-476a-aaf1-52cb628910a9-kube-api-access-pflms\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.446564 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pflms\" (UniqueName: \"kubernetes.io/projected/6659cf08-46c3-476a-aaf1-52cb628910a9-kube-api-access-pflms\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.446962 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-catalog-content\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.447133 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-utilities\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.447636 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-catalog-content\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.447691 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-utilities\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.472165 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pflms\" (UniqueName: \"kubernetes.io/projected/6659cf08-46c3-476a-aaf1-52cb628910a9-kube-api-access-pflms\") pod \"community-operators-7kplc\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.545704 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kplc"
Oct 04 05:00:10 crc kubenswrapper[4574]: I1004 05:00:10.878278 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7kplc"]
Oct 04 05:00:11 crc kubenswrapper[4574]: I1004 05:00:11.021863 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kplc" event={"ID":"6659cf08-46c3-476a-aaf1-52cb628910a9","Type":"ContainerStarted","Data":"55bc1d25a963310df617a18d043f3aadb762994247a01cd98956638b1e190b53"}
Oct 04 05:00:12 crc kubenswrapper[4574]: I1004 05:00:12.031490 4574 generic.go:334] "Generic (PLEG): container finished" podID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerID="c31bb11ac36bd84d6cb84cce98e5ce84e99fa47c909a105e60c8a7f56b8db2f5" exitCode=0
Oct 04 05:00:12 crc kubenswrapper[4574]: I1004 05:00:12.031601 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kplc" event={"ID":"6659cf08-46c3-476a-aaf1-52cb628910a9","Type":"ContainerDied","Data":"c31bb11ac36bd84d6cb84cce98e5ce84e99fa47c909a105e60c8a7f56b8db2f5"}
Oct 04 05:00:12 crc kubenswrapper[4574]: I1004 05:00:12.724518 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:12 crc kubenswrapper[4574]: I1004 05:00:12.724581 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:12 crc kubenswrapper[4574]: I1004 05:00:12.761445 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:13 crc kubenswrapper[4574]: I1004 05:00:13.084738 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:14 crc kubenswrapper[4574]: I1004 05:00:14.048600 4574 generic.go:334] "Generic (PLEG): container finished" podID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerID="a653ae787654a84c9c7676249c2e04da642f262d1182768fca75f0e86f14c703" exitCode=0
Oct 04 05:00:14 crc kubenswrapper[4574]: I1004 05:00:14.048647 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kplc" event={"ID":"6659cf08-46c3-476a-aaf1-52cb628910a9","Type":"ContainerDied","Data":"a653ae787654a84c9c7676249c2e04da642f262d1182768fca75f0e86f14c703"}
Oct 04 05:00:15 crc kubenswrapper[4574]: I1004 05:00:15.056645 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kplc" event={"ID":"6659cf08-46c3-476a-aaf1-52cb628910a9","Type":"ContainerStarted","Data":"8d5f0cd557be5a4cd72faa061823ce6fe5d0c6a68e12a40b8e8d8b9f361cf960"}
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.197311 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7kplc" podStartSLOduration=3.721574925 podStartE2EDuration="6.19728867s" podCreationTimestamp="2025-10-04 05:00:10 +0000 UTC" firstStartedPulling="2025-10-04 05:00:12.033344994 +0000 UTC m=+837.887488036" lastFinishedPulling="2025-10-04 05:00:14.509058739 +0000 UTC m=+840.363201781" observedRunningTime="2025-10-04 05:00:15.085740265 +0000 UTC m=+840.939883307" watchObservedRunningTime="2025-10-04 05:00:16.19728867 +0000 UTC m=+842.051431722"
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.199740 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jts65"]
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.199993 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jts65" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="registry-server" containerID="cri-o://1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75" gracePeriod=2
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.597382 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.729109 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-utilities\") pod \"7fec6701-f179-492a-9f1f-9aa2647a0b43\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") "
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.729290 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvx94\" (UniqueName: \"kubernetes.io/projected/7fec6701-f179-492a-9f1f-9aa2647a0b43-kube-api-access-rvx94\") pod \"7fec6701-f179-492a-9f1f-9aa2647a0b43\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") "
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.729330 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-catalog-content\") pod \"7fec6701-f179-492a-9f1f-9aa2647a0b43\" (UID: \"7fec6701-f179-492a-9f1f-9aa2647a0b43\") "
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.730489 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-utilities" (OuterVolumeSpecName: "utilities") pod "7fec6701-f179-492a-9f1f-9aa2647a0b43" (UID: "7fec6701-f179-492a-9f1f-9aa2647a0b43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.737437 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fec6701-f179-492a-9f1f-9aa2647a0b43-kube-api-access-rvx94" (OuterVolumeSpecName: "kube-api-access-rvx94") pod "7fec6701-f179-492a-9f1f-9aa2647a0b43" (UID: "7fec6701-f179-492a-9f1f-9aa2647a0b43"). InnerVolumeSpecName "kube-api-access-rvx94". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.830515 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-utilities\") on node \"crc\" DevicePath \"\""
Oct 04 05:00:16 crc kubenswrapper[4574]: I1004 05:00:16.830767 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvx94\" (UniqueName: \"kubernetes.io/projected/7fec6701-f179-492a-9f1f-9aa2647a0b43-kube-api-access-rvx94\") on node \"crc\" DevicePath \"\""
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.076910 4574 generic.go:334] "Generic (PLEG): container finished" podID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerID="1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75" exitCode=0
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.076974 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jts65"
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.076981 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jts65" event={"ID":"7fec6701-f179-492a-9f1f-9aa2647a0b43","Type":"ContainerDied","Data":"1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75"}
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.077097 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jts65" event={"ID":"7fec6701-f179-492a-9f1f-9aa2647a0b43","Type":"ContainerDied","Data":"b29a1390fa7ea98c9a829f9cfe016e1704aaf510e0c7058a19480383da5df782"}
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.077118 4574 scope.go:117] "RemoveContainer" containerID="1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75"
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.093654 4574 scope.go:117] "RemoveContainer" containerID="8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050"
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.107507 4574 scope.go:117] "RemoveContainer" containerID="efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9"
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.122375 4574 scope.go:117] "RemoveContainer" containerID="1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75"
Oct 04 05:00:17 crc kubenswrapper[4574]: E1004 05:00:17.122734 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75\": container with ID starting with 1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75 not found: ID does not exist" containerID="1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75"
Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.122771 4574 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75"} err="failed to get container status \"1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75\": rpc error: code = NotFound desc = could not find container \"1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75\": container with ID starting with 1d1067d2a5963f93b8993207ce29dfe35e3dfe850a4b0b4300843851619bea75 not found: ID does not exist" Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.122794 4574 scope.go:117] "RemoveContainer" containerID="8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050" Oct 04 05:00:17 crc kubenswrapper[4574]: E1004 05:00:17.123287 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050\": container with ID starting with 8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050 not found: ID does not exist" containerID="8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050" Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.123339 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050"} err="failed to get container status \"8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050\": rpc error: code = NotFound desc = could not find container \"8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050\": container with ID starting with 8adc100b115f776f000283eb43920ed75d444b209d29c8af30ee05e5d9d12050 not found: ID does not exist" Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.123371 4574 scope.go:117] "RemoveContainer" containerID="efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9" Oct 04 05:00:17 crc kubenswrapper[4574]: E1004 05:00:17.123661 4574 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9\": container with ID starting with efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9 not found: ID does not exist" containerID="efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9" Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.123686 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9"} err="failed to get container status \"efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9\": rpc error: code = NotFound desc = could not find container \"efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9\": container with ID starting with efdb9cbd0562928a152462c4f58ea925b9bb9f74ce69d2c4cbb28342bfe62dd9 not found: ID does not exist" Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.465433 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fec6701-f179-492a-9f1f-9aa2647a0b43" (UID: "7fec6701-f179-492a-9f1f-9aa2647a0b43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.540551 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fec6701-f179-492a-9f1f-9aa2647a0b43-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.709374 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jts65"] Oct 04 05:00:17 crc kubenswrapper[4574]: I1004 05:00:17.715494 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jts65"] Oct 04 05:00:18 crc kubenswrapper[4574]: I1004 05:00:18.740857 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" path="/var/lib/kubelet/pods/7fec6701-f179-492a-9f1f-9aa2647a0b43/volumes" Oct 04 05:00:20 crc kubenswrapper[4574]: I1004 05:00:20.546787 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7kplc" Oct 04 05:00:20 crc kubenswrapper[4574]: I1004 05:00:20.547169 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7kplc" Oct 04 05:00:20 crc kubenswrapper[4574]: I1004 05:00:20.586659 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7kplc" Oct 04 05:00:20 crc kubenswrapper[4574]: I1004 05:00:20.620086 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7956f7d5bc-68jqm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.145460 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7kplc" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.377126 4574 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/frr-k8s-qfz6d"] Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.377589 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="extract-utilities" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.377613 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="extract-utilities" Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.377647 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="registry-server" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.377656 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="registry-server" Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.377665 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="extract-content" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.377673 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="extract-content" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.377832 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fec6701-f179-492a-9f1f-9aa2647a0b43" containerName="registry-server" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.380603 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7"] Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.381217 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.381843 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.385720 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.385882 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.385884 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9jzwd" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.385809 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392403 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-conf\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392459 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-metrics-certs\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392504 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-reloader\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392530 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpnb\" (UniqueName: \"kubernetes.io/projected/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-kube-api-access-tkpnb\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392568 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-startup\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392628 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-sockets\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392673 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zb8z\" (UniqueName: \"kubernetes.io/projected/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-kube-api-access-4zb8z\") pod \"frr-k8s-webhook-server-64bf5d555-2lxv7\" (UID: \"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392706 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-metrics\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.392727 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-cert\") pod \"frr-k8s-webhook-server-64bf5d555-2lxv7\" (UID: \"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.412722 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7"] Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493628 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-startup\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493697 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-sockets\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493739 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zb8z\" (UniqueName: \"kubernetes.io/projected/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-kube-api-access-4zb8z\") pod \"frr-k8s-webhook-server-64bf5d555-2lxv7\" (UID: \"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493783 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-metrics\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 
05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493804 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-cert\") pod \"frr-k8s-webhook-server-64bf5d555-2lxv7\" (UID: \"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493836 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-conf\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493862 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-metrics-certs\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493897 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-reloader\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.493916 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpnb\" (UniqueName: \"kubernetes.io/projected/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-kube-api-access-tkpnb\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.494153 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-sockets\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.494461 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-reloader\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.494464 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-conf\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.494613 4574 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.494711 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-cert podName:d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13 nodeName:}" failed. No retries permitted until 2025-10-04 05:00:21.99468279 +0000 UTC m=+847.848825912 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-cert") pod "frr-k8s-webhook-server-64bf5d555-2lxv7" (UID: "d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13") : secret "frr-k8s-webhook-server-cert" not found Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.494683 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-metrics\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.495309 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-frr-startup\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.499659 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-metrics-certs\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.524060 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpnb\" (UniqueName: \"kubernetes.io/projected/54b0a1bb-eb0c-4ff2-b41d-966594fe7504-kube-api-access-tkpnb\") pod \"frr-k8s-qfz6d\" (UID: \"54b0a1bb-eb0c-4ff2-b41d-966594fe7504\") " pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.526013 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zb8z\" (UniqueName: \"kubernetes.io/projected/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-kube-api-access-4zb8z\") pod \"frr-k8s-webhook-server-64bf5d555-2lxv7\" (UID: 
\"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.531325 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pjq5j"] Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.532572 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.538251 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.538290 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.539703 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9hk87" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.541777 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.569857 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-fl9dm"] Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.571040 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.573118 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.591731 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-fl9dm"] Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.696539 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.696663 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-metrics-certs\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.696715 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metrics-certs\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.696751 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metallb-excludel2\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc 
kubenswrapper[4574]: I1004 05:00:21.696796 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkch2\" (UniqueName: \"kubernetes.io/projected/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-kube-api-access-dkch2\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.696826 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-cert\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.697002 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xghtw\" (UniqueName: \"kubernetes.io/projected/7de5a0bd-8082-40f2-9288-2c5417547a96-kube-api-access-xghtw\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.708342 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.797724 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkch2\" (UniqueName: \"kubernetes.io/projected/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-kube-api-access-dkch2\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.798403 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-cert\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.798498 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xghtw\" (UniqueName: \"kubernetes.io/projected/7de5a0bd-8082-40f2-9288-2c5417547a96-kube-api-access-xghtw\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.798608 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.798760 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-metrics-certs\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 
05:00:21.798883 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metrics-certs\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.799019 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metallb-excludel2\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.798752 4574 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.798989 4574 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.799857 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metallb-excludel2\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.799860 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-metrics-certs podName:7de5a0bd-8082-40f2-9288-2c5417547a96 nodeName:}" failed. No retries permitted until 2025-10-04 05:00:22.299840325 +0000 UTC m=+848.153983367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-metrics-certs") pod "controller-68d546b9d8-fl9dm" (UID: "7de5a0bd-8082-40f2-9288-2c5417547a96") : secret "controller-certs-secret" not found Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.799873 4574 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.799939 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metrics-certs podName:d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb nodeName:}" failed. No retries permitted until 2025-10-04 05:00:22.299921187 +0000 UTC m=+848.154064299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metrics-certs") pod "speaker-pjq5j" (UID: "d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb") : secret "speaker-certs-secret" not found Oct 04 05:00:21 crc kubenswrapper[4574]: E1004 05:00:21.800097 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist podName:d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb nodeName:}" failed. No retries permitted until 2025-10-04 05:00:22.300085482 +0000 UTC m=+848.154228624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist") pod "speaker-pjq5j" (UID: "d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb") : secret "metallb-memberlist" not found Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.805427 4574 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.813481 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-cert\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.825902 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkch2\" (UniqueName: \"kubernetes.io/projected/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-kube-api-access-dkch2\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:21 crc kubenswrapper[4574]: I1004 05:00:21.827756 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xghtw\" (UniqueName: \"kubernetes.io/projected/7de5a0bd-8082-40f2-9288-2c5417547a96-kube-api-access-xghtw\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.001829 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-cert\") pod \"frr-k8s-webhook-server-64bf5d555-2lxv7\" (UID: \"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.005494 
4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13-cert\") pod \"frr-k8s-webhook-server-64bf5d555-2lxv7\" (UID: \"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.109787 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerStarted","Data":"91b30b36b25cd1e71fe906595d17adc84f9b936521d192508b2df847101cb77b"} Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.299397 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.305027 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.305133 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-metrics-certs\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:22 crc kubenswrapper[4574]: E1004 05:00:22.305193 4574 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 04 05:00:22 crc kubenswrapper[4574]: E1004 05:00:22.305279 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist podName:d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb nodeName:}" 
failed. No retries permitted until 2025-10-04 05:00:23.305260116 +0000 UTC m=+849.159403158 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist") pod "speaker-pjq5j" (UID: "d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb") : secret "metallb-memberlist" not found Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.305655 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metrics-certs\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.308377 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7de5a0bd-8082-40f2-9288-2c5417547a96-metrics-certs\") pod \"controller-68d546b9d8-fl9dm\" (UID: \"7de5a0bd-8082-40f2-9288-2c5417547a96\") " pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.308512 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-metrics-certs\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.485505 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.611023 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-swvpl"] Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.613824 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.638487 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvpl"] Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.713632 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-catalog-content\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.714226 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2pg\" (UniqueName: \"kubernetes.io/projected/ea64aef1-a36e-4013-978c-c6ddb4ea3626-kube-api-access-hp2pg\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.714441 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-utilities\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.745733 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7"] Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.807994 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-fl9dm"] Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.816037 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hp2pg\" (UniqueName: \"kubernetes.io/projected/ea64aef1-a36e-4013-978c-c6ddb4ea3626-kube-api-access-hp2pg\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.816185 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-utilities\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.816270 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-catalog-content\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.816921 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-catalog-content\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.816955 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-utilities\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.866619 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2pg\" (UniqueName: 
\"kubernetes.io/projected/ea64aef1-a36e-4013-978c-c6ddb4ea3626-kube-api-access-hp2pg\") pod \"redhat-marketplace-swvpl\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:22 crc kubenswrapper[4574]: I1004 05:00:22.934376 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:23 crc kubenswrapper[4574]: I1004 05:00:23.135311 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" event={"ID":"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13","Type":"ContainerStarted","Data":"26afc5baf0eaf795c46536b761d4bc56c9fdaab4e943a90beda5b1c8b596e65f"} Oct 04 05:00:23 crc kubenswrapper[4574]: I1004 05:00:23.149366 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-fl9dm" event={"ID":"7de5a0bd-8082-40f2-9288-2c5417547a96","Type":"ContainerStarted","Data":"4873f074b8b2a1148730a9b3fb60006305c94ce70d20734d99f1990691135d03"} Oct 04 05:00:23 crc kubenswrapper[4574]: I1004 05:00:23.149413 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-fl9dm" event={"ID":"7de5a0bd-8082-40f2-9288-2c5417547a96","Type":"ContainerStarted","Data":"174c986e3d38b6593b5f551875e0a29b65ad4a280f82a4a16aea75d7ef6ec83c"} Oct 04 05:00:23 crc kubenswrapper[4574]: I1004 05:00:23.329124 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist\") pod \"speaker-pjq5j\" (UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:23 crc kubenswrapper[4574]: I1004 05:00:23.338116 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb-memberlist\") pod \"speaker-pjq5j\" 
(UID: \"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb\") " pod="metallb-system/speaker-pjq5j" Oct 04 05:00:23 crc kubenswrapper[4574]: I1004 05:00:23.373610 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvpl"] Oct 04 05:00:23 crc kubenswrapper[4574]: I1004 05:00:23.373941 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pjq5j" Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.158271 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-fl9dm" event={"ID":"7de5a0bd-8082-40f2-9288-2c5417547a96","Type":"ContainerStarted","Data":"e7a1c216c0a104e2a4f7cc29eab6ba64826ededcea709254590a7fc7c540dfc4"} Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.158923 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.165166 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pjq5j" event={"ID":"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb","Type":"ContainerStarted","Data":"7bb478057201ae1f71c462885d5ed310a617b9eac542d77281a2737a803433cf"} Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.165248 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pjq5j" event={"ID":"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb","Type":"ContainerStarted","Data":"69d1ccea5775d0d93b40f3211fe6e6ecd45c43692d68c458c47a2af8a75b9b5f"} Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.165264 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pjq5j" event={"ID":"d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb","Type":"ContainerStarted","Data":"1b4ca50b0dc5fd1f962791b71279d4c68bcdcebb7db435582997fa928f7dbad0"} Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.166065 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/speaker-pjq5j" Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.168613 4574 generic.go:334] "Generic (PLEG): container finished" podID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerID="86978f3b510286a911471b8470c8ab2beab3b2e37cc43169191613f942a4f73d" exitCode=0 Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.168654 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvpl" event={"ID":"ea64aef1-a36e-4013-978c-c6ddb4ea3626","Type":"ContainerDied","Data":"86978f3b510286a911471b8470c8ab2beab3b2e37cc43169191613f942a4f73d"} Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.168792 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvpl" event={"ID":"ea64aef1-a36e-4013-978c-c6ddb4ea3626","Type":"ContainerStarted","Data":"a47c24d75866b7683fd494eea41af56778a9a0f58c5f957980f1e89ce03aa0b6"} Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.184053 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-fl9dm" podStartSLOduration=3.18403445 podStartE2EDuration="3.18403445s" podCreationTimestamp="2025-10-04 05:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:00:24.176069248 +0000 UTC m=+850.030212300" watchObservedRunningTime="2025-10-04 05:00:24.18403445 +0000 UTC m=+850.038177492" Oct 04 05:00:24 crc kubenswrapper[4574]: I1004 05:00:24.216341 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pjq5j" podStartSLOduration=3.216319828 podStartE2EDuration="3.216319828s" podCreationTimestamp="2025-10-04 05:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:00:24.21535972 +0000 UTC m=+850.069502762" 
watchObservedRunningTime="2025-10-04 05:00:24.216319828 +0000 UTC m=+850.070462890" Oct 04 05:00:25 crc kubenswrapper[4574]: I1004 05:00:25.183183 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvpl" event={"ID":"ea64aef1-a36e-4013-978c-c6ddb4ea3626","Type":"ContainerStarted","Data":"a3559f2f28a89ad3c4b1e0c1a9289108b039a0a6633bf99e285f59f530402fe8"} Oct 04 05:00:25 crc kubenswrapper[4574]: I1004 05:00:25.798822 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7kplc"] Oct 04 05:00:25 crc kubenswrapper[4574]: I1004 05:00:25.799441 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7kplc" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="registry-server" containerID="cri-o://8d5f0cd557be5a4cd72faa061823ce6fe5d0c6a68e12a40b8e8d8b9f361cf960" gracePeriod=2 Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.224757 4574 generic.go:334] "Generic (PLEG): container finished" podID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerID="8d5f0cd557be5a4cd72faa061823ce6fe5d0c6a68e12a40b8e8d8b9f361cf960" exitCode=0 Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.224839 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kplc" event={"ID":"6659cf08-46c3-476a-aaf1-52cb628910a9","Type":"ContainerDied","Data":"8d5f0cd557be5a4cd72faa061823ce6fe5d0c6a68e12a40b8e8d8b9f361cf960"} Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.235433 4574 generic.go:334] "Generic (PLEG): container finished" podID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerID="a3559f2f28a89ad3c4b1e0c1a9289108b039a0a6633bf99e285f59f530402fe8" exitCode=0 Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.235486 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvpl" 
event={"ID":"ea64aef1-a36e-4013-978c-c6ddb4ea3626","Type":"ContainerDied","Data":"a3559f2f28a89ad3c4b1e0c1a9289108b039a0a6633bf99e285f59f530402fe8"} Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.366862 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kplc" Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.474824 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pflms\" (UniqueName: \"kubernetes.io/projected/6659cf08-46c3-476a-aaf1-52cb628910a9-kube-api-access-pflms\") pod \"6659cf08-46c3-476a-aaf1-52cb628910a9\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.474870 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-utilities\") pod \"6659cf08-46c3-476a-aaf1-52cb628910a9\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.474961 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-catalog-content\") pod \"6659cf08-46c3-476a-aaf1-52cb628910a9\" (UID: \"6659cf08-46c3-476a-aaf1-52cb628910a9\") " Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.475779 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-utilities" (OuterVolumeSpecName: "utilities") pod "6659cf08-46c3-476a-aaf1-52cb628910a9" (UID: "6659cf08-46c3-476a-aaf1-52cb628910a9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.495451 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6659cf08-46c3-476a-aaf1-52cb628910a9-kube-api-access-pflms" (OuterVolumeSpecName: "kube-api-access-pflms") pod "6659cf08-46c3-476a-aaf1-52cb628910a9" (UID: "6659cf08-46c3-476a-aaf1-52cb628910a9"). InnerVolumeSpecName "kube-api-access-pflms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.532406 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6659cf08-46c3-476a-aaf1-52cb628910a9" (UID: "6659cf08-46c3-476a-aaf1-52cb628910a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.576700 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pflms\" (UniqueName: \"kubernetes.io/projected/6659cf08-46c3-476a-aaf1-52cb628910a9-kube-api-access-pflms\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.576740 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:26 crc kubenswrapper[4574]: I1004 05:00:26.576752 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6659cf08-46c3-476a-aaf1-52cb628910a9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.251130 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7kplc" 
event={"ID":"6659cf08-46c3-476a-aaf1-52cb628910a9","Type":"ContainerDied","Data":"55bc1d25a963310df617a18d043f3aadb762994247a01cd98956638b1e190b53"} Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.251496 4574 scope.go:117] "RemoveContainer" containerID="8d5f0cd557be5a4cd72faa061823ce6fe5d0c6a68e12a40b8e8d8b9f361cf960" Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.251635 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7kplc" Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.259490 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvpl" event={"ID":"ea64aef1-a36e-4013-978c-c6ddb4ea3626","Type":"ContainerStarted","Data":"671fe3cc1d1ddb2fa4bc3bde6556a842be851ef12dbe6b4f741fa81ebeb42a88"} Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.280635 4574 scope.go:117] "RemoveContainer" containerID="a653ae787654a84c9c7676249c2e04da642f262d1182768fca75f0e86f14c703" Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.317413 4574 scope.go:117] "RemoveContainer" containerID="c31bb11ac36bd84d6cb84cce98e5ce84e99fa47c909a105e60c8a7f56b8db2f5" Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.328224 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7kplc"] Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.353523 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7kplc"] Oct 04 05:00:27 crc kubenswrapper[4574]: I1004 05:00:27.427465 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-swvpl" podStartSLOduration=2.888606935 podStartE2EDuration="5.427444983s" podCreationTimestamp="2025-10-04 05:00:22 +0000 UTC" firstStartedPulling="2025-10-04 05:00:24.170340962 +0000 UTC m=+850.024484004" lastFinishedPulling="2025-10-04 05:00:26.709179 +0000 UTC 
m=+852.563322052" observedRunningTime="2025-10-04 05:00:27.420684807 +0000 UTC m=+853.274827849" watchObservedRunningTime="2025-10-04 05:00:27.427444983 +0000 UTC m=+853.281588025" Oct 04 05:00:28 crc kubenswrapper[4574]: I1004 05:00:28.746217 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" path="/var/lib/kubelet/pods/6659cf08-46c3-476a-aaf1-52cb628910a9/volumes" Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.302114 4574 generic.go:334] "Generic (PLEG): container finished" podID="54b0a1bb-eb0c-4ff2-b41d-966594fe7504" containerID="9ba5f6d5ea999db299adca6d130aa55274bdafb821862dc06be67fb1c2d6f9cf" exitCode=0 Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.302165 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerDied","Data":"9ba5f6d5ea999db299adca6d130aa55274bdafb821862dc06be67fb1c2d6f9cf"} Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.303765 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" event={"ID":"d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13","Type":"ContainerStarted","Data":"0ad362487a8ed844964642fe0450634e720d07343995578537f202b1d349032a"} Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.303903 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.934840 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.935259 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.976301 4574 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:32 crc kubenswrapper[4574]: I1004 05:00:32.995555 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" podStartSLOduration=2.778914183 podStartE2EDuration="11.995539223s" podCreationTimestamp="2025-10-04 05:00:21 +0000 UTC" firstStartedPulling="2025-10-04 05:00:22.765380752 +0000 UTC m=+848.619523794" lastFinishedPulling="2025-10-04 05:00:31.982005792 +0000 UTC m=+857.836148834" observedRunningTime="2025-10-04 05:00:32.34995743 +0000 UTC m=+858.204100472" watchObservedRunningTime="2025-10-04 05:00:32.995539223 +0000 UTC m=+858.849682265" Oct 04 05:00:33 crc kubenswrapper[4574]: I1004 05:00:33.311136 4574 generic.go:334] "Generic (PLEG): container finished" podID="54b0a1bb-eb0c-4ff2-b41d-966594fe7504" containerID="8e0fe430bc4493e6726ce6bf02cd5a1f724a16cde030d34adcb46fc986e2a8c1" exitCode=0 Oct 04 05:00:33 crc kubenswrapper[4574]: I1004 05:00:33.311216 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerDied","Data":"8e0fe430bc4493e6726ce6bf02cd5a1f724a16cde030d34adcb46fc986e2a8c1"} Oct 04 05:00:33 crc kubenswrapper[4574]: I1004 05:00:33.359041 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:33 crc kubenswrapper[4574]: I1004 05:00:33.380885 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pjq5j" Oct 04 05:00:34 crc kubenswrapper[4574]: I1004 05:00:34.319420 4574 generic.go:334] "Generic (PLEG): container finished" podID="54b0a1bb-eb0c-4ff2-b41d-966594fe7504" containerID="a0624bbfb43bd17f5f9fe1c495c7f1044a0fc1087336de96167933201c6936be" exitCode=0 Oct 04 05:00:34 crc kubenswrapper[4574]: I1004 05:00:34.319613 4574 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerDied","Data":"a0624bbfb43bd17f5f9fe1c495c7f1044a0fc1087336de96167933201c6936be"} Oct 04 05:00:34 crc kubenswrapper[4574]: I1004 05:00:34.793612 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvpl"] Oct 04 05:00:35 crc kubenswrapper[4574]: I1004 05:00:35.331379 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerStarted","Data":"9d78a82056fbe925abb17512f40d4eac0b130849ffbc61a9cc7efc5a39a4c694"} Oct 04 05:00:35 crc kubenswrapper[4574]: I1004 05:00:35.331986 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerStarted","Data":"b9deee71aadda2263bed9a9d228d3c3ac988ec47162be2bf0827715dfa694f5d"} Oct 04 05:00:35 crc kubenswrapper[4574]: I1004 05:00:35.332000 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerStarted","Data":"f70c4012ad294a8a5e548e2f24cc0dd7ff6396fdc0ad4ffaf2a70d253be2f1d6"} Oct 04 05:00:35 crc kubenswrapper[4574]: I1004 05:00:35.332011 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerStarted","Data":"f27098800cf918552c160f401f384789129f734f2ed52f47e4a2e824575ac078"} Oct 04 05:00:36 crc kubenswrapper[4574]: I1004 05:00:36.342473 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerStarted","Data":"516684a322312ade031e09de9e0ce75f6282e697dcbef93f0e2565e61886a418"} Oct 04 05:00:36 crc kubenswrapper[4574]: I1004 05:00:36.342522 4574 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-qfz6d" event={"ID":"54b0a1bb-eb0c-4ff2-b41d-966594fe7504","Type":"ContainerStarted","Data":"37bb38c2d552471017b8b3865d5916517dd4d1ad088de619db594e71748aed01"} Oct 04 05:00:36 crc kubenswrapper[4574]: I1004 05:00:36.342652 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-swvpl" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="registry-server" containerID="cri-o://671fe3cc1d1ddb2fa4bc3bde6556a842be851ef12dbe6b4f741fa81ebeb42a88" gracePeriod=2 Oct 04 05:00:36 crc kubenswrapper[4574]: I1004 05:00:36.342782 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:36 crc kubenswrapper[4574]: I1004 05:00:36.373837 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qfz6d" podStartSLOduration=5.287354297 podStartE2EDuration="15.373816905s" podCreationTimestamp="2025-10-04 05:00:21 +0000 UTC" firstStartedPulling="2025-10-04 05:00:21.920840759 +0000 UTC m=+847.774983801" lastFinishedPulling="2025-10-04 05:00:32.007303367 +0000 UTC m=+857.861446409" observedRunningTime="2025-10-04 05:00:36.369931432 +0000 UTC m=+862.224074474" watchObservedRunningTime="2025-10-04 05:00:36.373816905 +0000 UTC m=+862.227959947" Oct 04 05:00:36 crc kubenswrapper[4574]: I1004 05:00:36.708609 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:36 crc kubenswrapper[4574]: I1004 05:00:36.837085 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:37 crc kubenswrapper[4574]: I1004 05:00:37.349877 4574 generic.go:334] "Generic (PLEG): container finished" podID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerID="671fe3cc1d1ddb2fa4bc3bde6556a842be851ef12dbe6b4f741fa81ebeb42a88" exitCode=0 Oct 04 05:00:37 crc 
kubenswrapper[4574]: I1004 05:00:37.349940 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvpl" event={"ID":"ea64aef1-a36e-4013-978c-c6ddb4ea3626","Type":"ContainerDied","Data":"671fe3cc1d1ddb2fa4bc3bde6556a842be851ef12dbe6b4f741fa81ebeb42a88"} Oct 04 05:00:37 crc kubenswrapper[4574]: I1004 05:00:37.868040 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.031848 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2pg\" (UniqueName: \"kubernetes.io/projected/ea64aef1-a36e-4013-978c-c6ddb4ea3626-kube-api-access-hp2pg\") pod \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.032003 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-utilities\") pod \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.032028 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-catalog-content\") pod \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\" (UID: \"ea64aef1-a36e-4013-978c-c6ddb4ea3626\") " Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.033005 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-utilities" (OuterVolumeSpecName: "utilities") pod "ea64aef1-a36e-4013-978c-c6ddb4ea3626" (UID: "ea64aef1-a36e-4013-978c-c6ddb4ea3626"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.042877 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea64aef1-a36e-4013-978c-c6ddb4ea3626-kube-api-access-hp2pg" (OuterVolumeSpecName: "kube-api-access-hp2pg") pod "ea64aef1-a36e-4013-978c-c6ddb4ea3626" (UID: "ea64aef1-a36e-4013-978c-c6ddb4ea3626"). InnerVolumeSpecName "kube-api-access-hp2pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.046388 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea64aef1-a36e-4013-978c-c6ddb4ea3626" (UID: "ea64aef1-a36e-4013-978c-c6ddb4ea3626"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.133284 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.133566 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea64aef1-a36e-4013-978c-c6ddb4ea3626-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.133649 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp2pg\" (UniqueName: \"kubernetes.io/projected/ea64aef1-a36e-4013-978c-c6ddb4ea3626-kube-api-access-hp2pg\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.359272 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvpl" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.359271 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvpl" event={"ID":"ea64aef1-a36e-4013-978c-c6ddb4ea3626","Type":"ContainerDied","Data":"a47c24d75866b7683fd494eea41af56778a9a0f58c5f957980f1e89ce03aa0b6"} Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.359689 4574 scope.go:117] "RemoveContainer" containerID="671fe3cc1d1ddb2fa4bc3bde6556a842be851ef12dbe6b4f741fa81ebeb42a88" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.381035 4574 scope.go:117] "RemoveContainer" containerID="a3559f2f28a89ad3c4b1e0c1a9289108b039a0a6633bf99e285f59f530402fe8" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.390462 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvpl"] Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.401314 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvpl"] Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.402308 4574 scope.go:117] "RemoveContainer" containerID="86978f3b510286a911471b8470c8ab2beab3b2e37cc43169191613f942a4f73d" Oct 04 05:00:38 crc kubenswrapper[4574]: I1004 05:00:38.749857 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" path="/var/lib/kubelet/pods/ea64aef1-a36e-4013-978c-c6ddb4ea3626/volumes" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.003747 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nk4xx"] Oct 04 05:00:42 crc kubenswrapper[4574]: E1004 05:00:42.005445 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="extract-content" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.005563 4574 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="extract-content" Oct 04 05:00:42 crc kubenswrapper[4574]: E1004 05:00:42.005646 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="extract-utilities" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.005731 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="extract-utilities" Oct 04 05:00:42 crc kubenswrapper[4574]: E1004 05:00:42.005817 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="registry-server" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.005897 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="registry-server" Oct 04 05:00:42 crc kubenswrapper[4574]: E1004 05:00:42.005985 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="extract-utilities" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.006055 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="extract-utilities" Oct 04 05:00:42 crc kubenswrapper[4574]: E1004 05:00:42.006144 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="extract-content" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.006220 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="extract-content" Oct 04 05:00:42 crc kubenswrapper[4574]: E1004 05:00:42.006333 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="registry-server" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.006408 4574 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="registry-server" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.006631 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea64aef1-a36e-4013-978c-c6ddb4ea3626" containerName="registry-server" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.007874 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="6659cf08-46c3-476a-aaf1-52cb628910a9" containerName="registry-server" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.008543 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.011482 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.011725 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.015055 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nk4xx"] Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.016619 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5bmlf" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.194306 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qx6q\" (UniqueName: \"kubernetes.io/projected/116021ce-1084-4c34-b4b8-9499015e58c0-kube-api-access-8qx6q\") pod \"openstack-operator-index-nk4xx\" (UID: \"116021ce-1084-4c34-b4b8-9499015e58c0\") " pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.296159 4574 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8qx6q\" (UniqueName: \"kubernetes.io/projected/116021ce-1084-4c34-b4b8-9499015e58c0-kube-api-access-8qx6q\") pod \"openstack-operator-index-nk4xx\" (UID: \"116021ce-1084-4c34-b4b8-9499015e58c0\") " pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.304504 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2lxv7" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.315925 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qx6q\" (UniqueName: \"kubernetes.io/projected/116021ce-1084-4c34-b4b8-9499015e58c0-kube-api-access-8qx6q\") pod \"openstack-operator-index-nk4xx\" (UID: \"116021ce-1084-4c34-b4b8-9499015e58c0\") " pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.324033 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.492986 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-fl9dm" Oct 04 05:00:42 crc kubenswrapper[4574]: I1004 05:00:42.561677 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nk4xx"] Oct 04 05:00:43 crc kubenswrapper[4574]: I1004 05:00:43.399003 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nk4xx" event={"ID":"116021ce-1084-4c34-b4b8-9499015e58c0","Type":"ContainerStarted","Data":"f0a87f75900168044c36b72fe3300d6a054e2c6e67703fdb39cbdee3f8d3293c"} Oct 04 05:00:46 crc kubenswrapper[4574]: I1004 05:00:46.417389 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nk4xx" event={"ID":"116021ce-1084-4c34-b4b8-9499015e58c0","Type":"ContainerStarted","Data":"6b47c0f75791fa1b2826fd86bf3cce6ec35c8f7116df83223316a3ad2c877119"} Oct 04 05:00:46 crc kubenswrapper[4574]: I1004 05:00:46.432704 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nk4xx" podStartSLOduration=2.604619433 podStartE2EDuration="5.432688462s" podCreationTimestamp="2025-10-04 05:00:41 +0000 UTC" firstStartedPulling="2025-10-04 05:00:42.570831503 +0000 UTC m=+868.424974535" lastFinishedPulling="2025-10-04 05:00:45.398900522 +0000 UTC m=+871.253043564" observedRunningTime="2025-10-04 05:00:46.432382593 +0000 UTC m=+872.286525635" watchObservedRunningTime="2025-10-04 05:00:46.432688462 +0000 UTC m=+872.286831504" Oct 04 05:00:51 crc kubenswrapper[4574]: I1004 05:00:51.710354 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qfz6d" Oct 04 05:00:52 crc kubenswrapper[4574]: I1004 05:00:52.330455 4574 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:52 crc kubenswrapper[4574]: I1004 05:00:52.330918 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:52 crc kubenswrapper[4574]: I1004 05:00:52.358345 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:52 crc kubenswrapper[4574]: I1004 05:00:52.479205 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nk4xx" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.038501 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94"] Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.040484 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.043153 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bwvsx" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.054370 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94"] Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.155297 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-util\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc 
kubenswrapper[4574]: I1004 05:00:54.155431 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-bundle\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.155574 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r96\" (UniqueName: \"kubernetes.io/projected/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-kube-api-access-m9r96\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.256527 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9r96\" (UniqueName: \"kubernetes.io/projected/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-kube-api-access-m9r96\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.256825 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-util\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.256908 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-bundle\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.257521 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-util\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.257753 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-bundle\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.278701 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9r96\" (UniqueName: \"kubernetes.io/projected/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-kube-api-access-m9r96\") pod \"dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.397671 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:54 crc kubenswrapper[4574]: I1004 05:00:54.796018 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94"] Oct 04 05:00:55 crc kubenswrapper[4574]: I1004 05:00:55.471543 4574 generic.go:334] "Generic (PLEG): container finished" podID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerID="c08d71cbd01ca6fe3beb60bb92d09f66f1783c5aeaf230093f66d243183b5cb3" exitCode=0 Oct 04 05:00:55 crc kubenswrapper[4574]: I1004 05:00:55.471886 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" event={"ID":"f42d7d5a-0727-4798-96da-ae6e57b9f3c5","Type":"ContainerDied","Data":"c08d71cbd01ca6fe3beb60bb92d09f66f1783c5aeaf230093f66d243183b5cb3"} Oct 04 05:00:55 crc kubenswrapper[4574]: I1004 05:00:55.471923 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" event={"ID":"f42d7d5a-0727-4798-96da-ae6e57b9f3c5","Type":"ContainerStarted","Data":"818ad103436131a57b5264729f0c90792e2efea902bbc57ce0b0099a8c583bec"} Oct 04 05:00:56 crc kubenswrapper[4574]: I1004 05:00:56.480407 4574 generic.go:334] "Generic (PLEG): container finished" podID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerID="58236e2c6979a5918eb97d46f7b00e1a0fb941a6dc4f318c870bdfe32b13a275" exitCode=0 Oct 04 05:00:56 crc kubenswrapper[4574]: I1004 05:00:56.480781 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" event={"ID":"f42d7d5a-0727-4798-96da-ae6e57b9f3c5","Type":"ContainerDied","Data":"58236e2c6979a5918eb97d46f7b00e1a0fb941a6dc4f318c870bdfe32b13a275"} Oct 04 05:00:57 crc kubenswrapper[4574]: I1004 05:00:57.488301 4574 generic.go:334] 
"Generic (PLEG): container finished" podID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerID="b72aaf2f82e23bc060e0ef5ccce1fa2228f5c337579163044495ba79a6fedffa" exitCode=0 Oct 04 05:00:57 crc kubenswrapper[4574]: I1004 05:00:57.488350 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" event={"ID":"f42d7d5a-0727-4798-96da-ae6e57b9f3c5","Type":"ContainerDied","Data":"b72aaf2f82e23bc060e0ef5ccce1fa2228f5c337579163044495ba79a6fedffa"} Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.720913 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.816892 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-util\") pod \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.816971 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9r96\" (UniqueName: \"kubernetes.io/projected/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-kube-api-access-m9r96\") pod \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.817044 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-bundle\") pod \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\" (UID: \"f42d7d5a-0727-4798-96da-ae6e57b9f3c5\") " Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.818172 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-bundle" (OuterVolumeSpecName: "bundle") pod "f42d7d5a-0727-4798-96da-ae6e57b9f3c5" (UID: "f42d7d5a-0727-4798-96da-ae6e57b9f3c5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.824350 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-kube-api-access-m9r96" (OuterVolumeSpecName: "kube-api-access-m9r96") pod "f42d7d5a-0727-4798-96da-ae6e57b9f3c5" (UID: "f42d7d5a-0727-4798-96da-ae6e57b9f3c5"). InnerVolumeSpecName "kube-api-access-m9r96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.834700 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-util" (OuterVolumeSpecName: "util") pod "f42d7d5a-0727-4798-96da-ae6e57b9f3c5" (UID: "f42d7d5a-0727-4798-96da-ae6e57b9f3c5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.918475 4574 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-util\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.918515 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9r96\" (UniqueName: \"kubernetes.io/projected/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-kube-api-access-m9r96\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:58 crc kubenswrapper[4574]: I1004 05:00:58.918529 4574 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f42d7d5a-0727-4798-96da-ae6e57b9f3c5-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:00:59 crc kubenswrapper[4574]: I1004 05:00:59.509930 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" event={"ID":"f42d7d5a-0727-4798-96da-ae6e57b9f3c5","Type":"ContainerDied","Data":"818ad103436131a57b5264729f0c90792e2efea902bbc57ce0b0099a8c583bec"} Oct 04 05:00:59 crc kubenswrapper[4574]: I1004 05:00:59.510183 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="818ad103436131a57b5264729f0c90792e2efea902bbc57ce0b0099a8c583bec" Oct 04 05:00:59 crc kubenswrapper[4574]: I1004 05:00:59.509992 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.157751 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp"] Oct 04 05:01:04 crc kubenswrapper[4574]: E1004 05:01:04.158337 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerName="extract" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.158351 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerName="extract" Oct 04 05:01:04 crc kubenswrapper[4574]: E1004 05:01:04.158367 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerName="pull" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.158372 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerName="pull" Oct 04 05:01:04 crc kubenswrapper[4574]: E1004 05:01:04.158384 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerName="util" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.158391 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerName="util" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.158499 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42d7d5a-0727-4798-96da-ae6e57b9f3c5" containerName="extract" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.159123 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.164831 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-hzcjp" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.193697 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp"] Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.288964 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgjsw\" (UniqueName: \"kubernetes.io/projected/6c734153-0dff-4669-ae00-bd91be75e4c6-kube-api-access-kgjsw\") pod \"openstack-operator-controller-operator-76d7b4df79-hsvhp\" (UID: \"6c734153-0dff-4669-ae00-bd91be75e4c6\") " pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.390080 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgjsw\" (UniqueName: \"kubernetes.io/projected/6c734153-0dff-4669-ae00-bd91be75e4c6-kube-api-access-kgjsw\") pod \"openstack-operator-controller-operator-76d7b4df79-hsvhp\" (UID: \"6c734153-0dff-4669-ae00-bd91be75e4c6\") " pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.420803 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgjsw\" (UniqueName: \"kubernetes.io/projected/6c734153-0dff-4669-ae00-bd91be75e4c6-kube-api-access-kgjsw\") pod \"openstack-operator-controller-operator-76d7b4df79-hsvhp\" (UID: \"6c734153-0dff-4669-ae00-bd91be75e4c6\") " pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.480324 4574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" Oct 04 05:01:04 crc kubenswrapper[4574]: I1004 05:01:04.933972 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp"] Oct 04 05:01:05 crc kubenswrapper[4574]: I1004 05:01:05.545415 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" event={"ID":"6c734153-0dff-4669-ae00-bd91be75e4c6","Type":"ContainerStarted","Data":"946f0490b20a1fcf2de8b4531074e281c193832194ee44b04d2abecc1654c01f"} Oct 04 05:01:10 crc kubenswrapper[4574]: I1004 05:01:10.578723 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" event={"ID":"6c734153-0dff-4669-ae00-bd91be75e4c6","Type":"ContainerStarted","Data":"5a115920ae6623d561c448e78793ac9358a4582ffef7d30cea5449e38c025621"} Oct 04 05:01:12 crc kubenswrapper[4574]: I1004 05:01:12.591293 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" event={"ID":"6c734153-0dff-4669-ae00-bd91be75e4c6","Type":"ContainerStarted","Data":"7e8f5c90377804ace3540a947fdcf50d7d4ebedecbdea858d6aa4daa0f664908"} Oct 04 05:01:12 crc kubenswrapper[4574]: I1004 05:01:12.591641 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" Oct 04 05:01:12 crc kubenswrapper[4574]: I1004 05:01:12.625170 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" podStartSLOduration=1.6278367249999999 podStartE2EDuration="8.625147321s" podCreationTimestamp="2025-10-04 05:01:04 +0000 UTC" 
firstStartedPulling="2025-10-04 05:01:04.94773721 +0000 UTC m=+890.801880262" lastFinishedPulling="2025-10-04 05:01:11.945047816 +0000 UTC m=+897.799190858" observedRunningTime="2025-10-04 05:01:12.622991279 +0000 UTC m=+898.477134331" watchObservedRunningTime="2025-10-04 05:01:12.625147321 +0000 UTC m=+898.479290353" Oct 04 05:01:14 crc kubenswrapper[4574]: I1004 05:01:14.483382 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-76d7b4df79-hsvhp" Oct 04 05:01:19 crc kubenswrapper[4574]: I1004 05:01:19.404901 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:01:19 crc kubenswrapper[4574]: I1004 05:01:19.405715 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.350831 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.352462 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.358247 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jtk57" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.362518 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.363701 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.368071 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-n67rw" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.371183 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.397836 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.398772 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.402632 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xnh8r" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.423283 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.424422 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.427894 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-d5rhc" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.428140 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.431705 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.462337 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kps4r\" (UniqueName: \"kubernetes.io/projected/9c976366-a9b2-4720-a5ce-2aeffaf0dad2-kube-api-access-kps4r\") pod \"barbican-operator-controller-manager-5f7c849b98-mgwq7\" (UID: \"9c976366-a9b2-4720-a5ce-2aeffaf0dad2\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.462419 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z22d4\" 
(UniqueName: \"kubernetes.io/projected/4552356b-ed71-465f-beb5-26c4a63dc81d-kube-api-access-z22d4\") pod \"cinder-operator-controller-manager-7d4d4f8d-9t5xx\" (UID: \"4552356b-ed71-465f-beb5-26c4a63dc81d\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.472590 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.496517 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.496659 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.501183 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hlqkt" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.508223 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.536130 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.537222 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.544763 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9hjtv" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.554026 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.563279 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq78l\" (UniqueName: \"kubernetes.io/projected/52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd-kube-api-access-cq78l\") pod \"glance-operator-controller-manager-5568b5d68-pmvc8\" (UID: \"52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.563326 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl6vx\" (UniqueName: \"kubernetes.io/projected/d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7-kube-api-access-fl6vx\") pod \"heat-operator-controller-manager-8f58bc9db-mdh2j\" (UID: \"d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.563353 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn9fv\" (UniqueName: \"kubernetes.io/projected/39766d86-7ab2-42ca-b6ae-0e02eb871cc3-kube-api-access-wn9fv\") pod \"designate-operator-controller-manager-75dfd9b554-qbzx8\" (UID: \"39766d86-7ab2-42ca-b6ae-0e02eb871cc3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 
05:01:31.563493 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z22d4\" (UniqueName: \"kubernetes.io/projected/4552356b-ed71-465f-beb5-26c4a63dc81d-kube-api-access-z22d4\") pod \"cinder-operator-controller-manager-7d4d4f8d-9t5xx\" (UID: \"4552356b-ed71-465f-beb5-26c4a63dc81d\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.563710 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kps4r\" (UniqueName: \"kubernetes.io/projected/9c976366-a9b2-4720-a5ce-2aeffaf0dad2-kube-api-access-kps4r\") pod \"barbican-operator-controller-manager-5f7c849b98-mgwq7\" (UID: \"9c976366-a9b2-4720-a5ce-2aeffaf0dad2\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.584999 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.586187 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.589845 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.589903 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4nwst" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.630334 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z22d4\" (UniqueName: \"kubernetes.io/projected/4552356b-ed71-465f-beb5-26c4a63dc81d-kube-api-access-z22d4\") pod \"cinder-operator-controller-manager-7d4d4f8d-9t5xx\" (UID: \"4552356b-ed71-465f-beb5-26c4a63dc81d\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.646638 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kps4r\" (UniqueName: \"kubernetes.io/projected/9c976366-a9b2-4720-a5ce-2aeffaf0dad2-kube-api-access-kps4r\") pod \"barbican-operator-controller-manager-5f7c849b98-mgwq7\" (UID: \"9c976366-a9b2-4720-a5ce-2aeffaf0dad2\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.624866 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.663521 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.663700 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.665927 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnf5h\" (UniqueName: \"kubernetes.io/projected/d552b4e4-9120-4d96-8615-fa6d68a71042-kube-api-access-xnf5h\") pod \"horizon-operator-controller-manager-54876c876f-96hsk\" (UID: \"d552b4e4-9120-4d96-8615-fa6d68a71042\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.665995 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5rh\" (UniqueName: \"kubernetes.io/projected/e288039e-c6d3-4911-b284-1eb1cd2bccf2-kube-api-access-wm5rh\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: \"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.666026 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: \"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.666098 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq78l\" (UniqueName: \"kubernetes.io/projected/52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd-kube-api-access-cq78l\") pod \"glance-operator-controller-manager-5568b5d68-pmvc8\" (UID: \"52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" Oct 04 05:01:31 crc kubenswrapper[4574]: 
I1004 05:01:31.666123 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl6vx\" (UniqueName: \"kubernetes.io/projected/d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7-kube-api-access-fl6vx\") pod \"heat-operator-controller-manager-8f58bc9db-mdh2j\" (UID: \"d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.666154 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn9fv\" (UniqueName: \"kubernetes.io/projected/39766d86-7ab2-42ca-b6ae-0e02eb871cc3-kube-api-access-wn9fv\") pod \"designate-operator-controller-manager-75dfd9b554-qbzx8\" (UID: \"39766d86-7ab2-42ca-b6ae-0e02eb871cc3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.666262 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.672665 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jh9lf" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.672858 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-zgggv" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.678047 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.679870 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.689673 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.691751 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.706088 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq78l\" (UniqueName: \"kubernetes.io/projected/52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd-kube-api-access-cq78l\") pod \"glance-operator-controller-manager-5568b5d68-pmvc8\" (UID: \"52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.706164 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.714282 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.716194 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn9fv\" (UniqueName: \"kubernetes.io/projected/39766d86-7ab2-42ca-b6ae-0e02eb871cc3-kube-api-access-wn9fv\") pod \"designate-operator-controller-manager-75dfd9b554-qbzx8\" (UID: \"39766d86-7ab2-42ca-b6ae-0e02eb871cc3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.718309 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.729378 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fz92t" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.748617 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.750171 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.758547 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl6vx\" (UniqueName: \"kubernetes.io/projected/d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7-kube-api-access-fl6vx\") pod \"heat-operator-controller-manager-8f58bc9db-mdh2j\" (UID: \"d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.772843 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-s27mx" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.787137 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.788852 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plsb2\" (UniqueName: \"kubernetes.io/projected/d4f548d4-c2a0-4756-a55a-3d398b81d923-kube-api-access-plsb2\") pod \"ironic-operator-controller-manager-699b87f775-xcjwv\" (UID: \"d4f548d4-c2a0-4756-a55a-3d398b81d923\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.788980 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnf5h\" (UniqueName: \"kubernetes.io/projected/d552b4e4-9120-4d96-8615-fa6d68a71042-kube-api-access-xnf5h\") pod \"horizon-operator-controller-manager-54876c876f-96hsk\" (UID: \"d552b4e4-9120-4d96-8615-fa6d68a71042\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.789043 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbkwl\" (UniqueName: \"kubernetes.io/projected/55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4-kube-api-access-fbkwl\") pod \"keystone-operator-controller-manager-7c777dc986-cvjnd\" (UID: \"55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4\") " pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.789078 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5rh\" (UniqueName: \"kubernetes.io/projected/e288039e-c6d3-4911-b284-1eb1cd2bccf2-kube-api-access-wm5rh\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: \"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:31 crc 
kubenswrapper[4574]: I1004 05:01:31.789147 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: \"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:31 crc kubenswrapper[4574]: E1004 05:01:31.789301 4574 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 04 05:01:31 crc kubenswrapper[4574]: E1004 05:01:31.789377 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert podName:e288039e-c6d3-4911-b284-1eb1cd2bccf2 nodeName:}" failed. No retries permitted until 2025-10-04 05:01:32.289355807 +0000 UTC m=+918.143498849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert") pod "infra-operator-controller-manager-658588b8c9-gnpjd" (UID: "e288039e-c6d3-4911-b284-1eb1cd2bccf2") : secret "infra-operator-webhook-server-cert" not found Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.814328 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.829334 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.842677 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.860550 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnf5h\" (UniqueName: \"kubernetes.io/projected/d552b4e4-9120-4d96-8615-fa6d68a71042-kube-api-access-xnf5h\") pod \"horizon-operator-controller-manager-54876c876f-96hsk\" (UID: \"d552b4e4-9120-4d96-8615-fa6d68a71042\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.863893 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5rh\" (UniqueName: \"kubernetes.io/projected/e288039e-c6d3-4911-b284-1eb1cd2bccf2-kube-api-access-wm5rh\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: \"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.888398 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.889625 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.890783 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plsb2\" (UniqueName: \"kubernetes.io/projected/d4f548d4-c2a0-4756-a55a-3d398b81d923-kube-api-access-plsb2\") pod \"ironic-operator-controller-manager-699b87f775-xcjwv\" (UID: \"d4f548d4-c2a0-4756-a55a-3d398b81d923\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.890822 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6v2\" (UniqueName: \"kubernetes.io/projected/85b1921d-1572-4aff-b002-2f31c2f270b4-kube-api-access-mj6v2\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j\" (UID: \"85b1921d-1572-4aff-b002-2f31c2f270b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.890857 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8lsd\" (UniqueName: \"kubernetes.io/projected/1edbf723-752f-416b-a922-12a73521d6f9-kube-api-access-g8lsd\") pod \"manila-operator-controller-manager-65d89cfd9f-88mfj\" (UID: \"1edbf723-752f-416b-a922-12a73521d6f9\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.890889 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbkwl\" (UniqueName: \"kubernetes.io/projected/55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4-kube-api-access-fbkwl\") pod \"keystone-operator-controller-manager-7c777dc986-cvjnd\" (UID: \"55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4\") " pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" Oct 04 05:01:31 
crc kubenswrapper[4574]: I1004 05:01:31.900342 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6gdnz" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.909828 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.910991 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.927526 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mvkjj" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.931915 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbkwl\" (UniqueName: \"kubernetes.io/projected/55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4-kube-api-access-fbkwl\") pod \"keystone-operator-controller-manager-7c777dc986-cvjnd\" (UID: \"55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4\") " pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.954326 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.963075 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plsb2\" (UniqueName: \"kubernetes.io/projected/d4f548d4-c2a0-4756-a55a-3d398b81d923-kube-api-access-plsb2\") pod \"ironic-operator-controller-manager-699b87f775-xcjwv\" (UID: \"d4f548d4-c2a0-4756-a55a-3d398b81d923\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.989154 4574 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t"] Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.994698 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6v2\" (UniqueName: \"kubernetes.io/projected/85b1921d-1572-4aff-b002-2f31c2f270b4-kube-api-access-mj6v2\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j\" (UID: \"85b1921d-1572-4aff-b002-2f31c2f270b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.994762 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc72d\" (UniqueName: \"kubernetes.io/projected/95f9af94-f839-464f-8c6f-8928146b0d26-kube-api-access-mc72d\") pod \"nova-operator-controller-manager-7c7fc454ff-t222j\" (UID: \"95f9af94-f839-464f-8c6f-8928146b0d26\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.994814 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5v5m\" (UniqueName: \"kubernetes.io/projected/90b04996-9e73-45c9-a03c-59e4bedf4ff4-kube-api-access-b5v5m\") pod \"neutron-operator-controller-manager-8d984cc4d-jt72t\" (UID: \"90b04996-9e73-45c9-a03c-59e4bedf4ff4\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" Oct 04 05:01:31 crc kubenswrapper[4574]: I1004 05:01:31.994848 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8lsd\" (UniqueName: \"kubernetes.io/projected/1edbf723-752f-416b-a922-12a73521d6f9-kube-api-access-g8lsd\") pod \"manila-operator-controller-manager-65d89cfd9f-88mfj\" (UID: \"1edbf723-752f-416b-a922-12a73521d6f9\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" Oct 04 05:01:32 crc 
kubenswrapper[4574]: I1004 05:01:32.024682 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.025928 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.027016 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.041552 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-m2hfd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.046170 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8lsd\" (UniqueName: \"kubernetes.io/projected/1edbf723-752f-416b-a922-12a73521d6f9-kube-api-access-g8lsd\") pod \"manila-operator-controller-manager-65d89cfd9f-88mfj\" (UID: \"1edbf723-752f-416b-a922-12a73521d6f9\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.075272 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.086607 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6v2\" (UniqueName: \"kubernetes.io/projected/85b1921d-1572-4aff-b002-2f31c2f270b4-kube-api-access-mj6v2\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j\" (UID: \"85b1921d-1572-4aff-b002-2f31c2f270b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.095267 4574 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.117225 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.097890 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc72d\" (UniqueName: \"kubernetes.io/projected/95f9af94-f839-464f-8c6f-8928146b0d26-kube-api-access-mc72d\") pod \"nova-operator-controller-manager-7c7fc454ff-t222j\" (UID: \"95f9af94-f839-464f-8c6f-8928146b0d26\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.117784 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5v5m\" (UniqueName: \"kubernetes.io/projected/90b04996-9e73-45c9-a03c-59e4bedf4ff4-kube-api-access-b5v5m\") pod \"neutron-operator-controller-manager-8d984cc4d-jt72t\" (UID: \"90b04996-9e73-45c9-a03c-59e4bedf4ff4\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.118315 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.099203 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.120625 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.122588 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.123853 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-krlzk" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.125338 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.147774 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.151325 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wjgqq" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.151623 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.164810 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-scvnn" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.165655 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.166484 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.179627 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.181683 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.189542 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc72d\" (UniqueName: \"kubernetes.io/projected/95f9af94-f839-464f-8c6f-8928146b0d26-kube-api-access-mc72d\") pod \"nova-operator-controller-manager-7c7fc454ff-t222j\" (UID: \"95f9af94-f839-464f-8c6f-8928146b0d26\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.195410 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.222227 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5v5m\" (UniqueName: \"kubernetes.io/projected/90b04996-9e73-45c9-a03c-59e4bedf4ff4-kube-api-access-b5v5m\") pod \"neutron-operator-controller-manager-8d984cc4d-jt72t\" (UID: \"90b04996-9e73-45c9-a03c-59e4bedf4ff4\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.223381 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szqdm\" (UniqueName: \"kubernetes.io/projected/54443166-57a5-4e11-914c-d9cb2f3252d7-kube-api-access-szqdm\") pod \"octavia-operator-controller-manager-7468f855d8-g2kpz\" (UID: 
\"54443166-57a5-4e11-914c-d9cb2f3252d7\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.223432 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8rq\" (UniqueName: \"kubernetes.io/projected/46bd489f-f708-4c7e-b697-39e9fd65a30e-kube-api-access-dv8rq\") pod \"ovn-operator-controller-manager-579449c7d5-sxfrz\" (UID: \"46bd489f-f708-4c7e-b697-39e9fd65a30e\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.223459 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b7b141-c133-4487-9ecb-fab0b12d82bb-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cz7492\" (UID: \"f0b7b141-c133-4487-9ecb-fab0b12d82bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.223624 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svtn\" (UniqueName: \"kubernetes.io/projected/28570522-1dff-475f-8ab0-963f4ac14534-kube-api-access-6svtn\") pod \"placement-operator-controller-manager-54689d9f88-b4fbd\" (UID: \"28570522-1dff-475f-8ab0-963f4ac14534\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.223732 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tcfx\" (UniqueName: \"kubernetes.io/projected/f0b7b141-c133-4487-9ecb-fab0b12d82bb-kube-api-access-5tcfx\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cz7492\" (UID: \"f0b7b141-c133-4487-9ecb-fab0b12d82bb\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.238697 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.240371 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.241345 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.253596 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.254531 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.255678 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.270122 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xnkwn" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.270919 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lhxqc" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.276959 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.312599 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.327574 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.328499 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsj5s\" (UniqueName: \"kubernetes.io/projected/e227d829-9a02-40dd-b0c5-012a7d024253-kube-api-access-dsj5s\") pod \"swift-operator-controller-manager-6859f9b676-2fzvp\" (UID: \"e227d829-9a02-40dd-b0c5-012a7d024253\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.328580 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: \"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " 
pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.328638 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szqdm\" (UniqueName: \"kubernetes.io/projected/54443166-57a5-4e11-914c-d9cb2f3252d7-kube-api-access-szqdm\") pod \"octavia-operator-controller-manager-7468f855d8-g2kpz\" (UID: \"54443166-57a5-4e11-914c-d9cb2f3252d7\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.328688 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8rq\" (UniqueName: \"kubernetes.io/projected/46bd489f-f708-4c7e-b697-39e9fd65a30e-kube-api-access-dv8rq\") pod \"ovn-operator-controller-manager-579449c7d5-sxfrz\" (UID: \"46bd489f-f708-4c7e-b697-39e9fd65a30e\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.328713 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b7b141-c133-4487-9ecb-fab0b12d82bb-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cz7492\" (UID: \"f0b7b141-c133-4487-9ecb-fab0b12d82bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.328763 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6svtn\" (UniqueName: \"kubernetes.io/projected/28570522-1dff-475f-8ab0-963f4ac14534-kube-api-access-6svtn\") pod \"placement-operator-controller-manager-54689d9f88-b4fbd\" (UID: \"28570522-1dff-475f-8ab0-963f4ac14534\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.328802 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5tcfx\" (UniqueName: \"kubernetes.io/projected/f0b7b141-c133-4487-9ecb-fab0b12d82bb-kube-api-access-5tcfx\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cz7492\" (UID: \"f0b7b141-c133-4487-9ecb-fab0b12d82bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: E1004 05:01:32.329185 4574 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 04 05:01:32 crc kubenswrapper[4574]: E1004 05:01:32.329258 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert podName:e288039e-c6d3-4911-b284-1eb1cd2bccf2 nodeName:}" failed. No retries permitted until 2025-10-04 05:01:33.329219559 +0000 UTC m=+919.183362601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert") pod "infra-operator-controller-manager-658588b8c9-gnpjd" (UID: "e288039e-c6d3-4911-b284-1eb1cd2bccf2") : secret "infra-operator-webhook-server-cert" not found Oct 04 05:01:32 crc kubenswrapper[4574]: E1004 05:01:32.329818 4574 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 04 05:01:32 crc kubenswrapper[4574]: E1004 05:01:32.329855 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0b7b141-c133-4487-9ecb-fab0b12d82bb-cert podName:f0b7b141-c133-4487-9ecb-fab0b12d82bb nodeName:}" failed. No retries permitted until 2025-10-04 05:01:32.829844448 +0000 UTC m=+918.683987490 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b7b141-c133-4487-9ecb-fab0b12d82bb-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" (UID: "f0b7b141-c133-4487-9ecb-fab0b12d82bb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.347726 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.349210 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.370116 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.371509 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.374139 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xr4hz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.377668 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-48nxx" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.395115 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svtn\" (UniqueName: \"kubernetes.io/projected/28570522-1dff-475f-8ab0-963f4ac14534-kube-api-access-6svtn\") pod \"placement-operator-controller-manager-54689d9f88-b4fbd\" (UID: \"28570522-1dff-475f-8ab0-963f4ac14534\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.395521 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.406185 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8rq\" (UniqueName: \"kubernetes.io/projected/46bd489f-f708-4c7e-b697-39e9fd65a30e-kube-api-access-dv8rq\") pod \"ovn-operator-controller-manager-579449c7d5-sxfrz\" (UID: \"46bd489f-f708-4c7e-b697-39e9fd65a30e\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.406318 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.413710 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tcfx\" (UniqueName: \"kubernetes.io/projected/f0b7b141-c133-4487-9ecb-fab0b12d82bb-kube-api-access-5tcfx\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cz7492\" (UID: \"f0b7b141-c133-4487-9ecb-fab0b12d82bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.415827 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szqdm\" (UniqueName: \"kubernetes.io/projected/54443166-57a5-4e11-914c-d9cb2f3252d7-kube-api-access-szqdm\") pod \"octavia-operator-controller-manager-7468f855d8-g2kpz\" (UID: \"54443166-57a5-4e11-914c-d9cb2f3252d7\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.435145 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ln4\" (UniqueName: \"kubernetes.io/projected/60dfec70-f10c-4d73-9933-f2cb76124090-kube-api-access-q4ln4\") pod 
\"telemetry-operator-controller-manager-5d4d74dd89-hfm8z\" (UID: \"60dfec70-f10c-4d73-9933-f2cb76124090\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.435375 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8v9\" (UniqueName: \"kubernetes.io/projected/f87750ff-5d28-4658-b7d4-bc49bcb35886-kube-api-access-nl8v9\") pod \"test-operator-controller-manager-5cd5cb47d7-mhxlg\" (UID: \"f87750ff-5d28-4658-b7d4-bc49bcb35886\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.435597 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsj5s\" (UniqueName: \"kubernetes.io/projected/e227d829-9a02-40dd-b0c5-012a7d024253-kube-api-access-dsj5s\") pod \"swift-operator-controller-manager-6859f9b676-2fzvp\" (UID: \"e227d829-9a02-40dd-b0c5-012a7d024253\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.450435 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.507035 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsj5s\" (UniqueName: \"kubernetes.io/projected/e227d829-9a02-40dd-b0c5-012a7d024253-kube-api-access-dsj5s\") pod \"swift-operator-controller-manager-6859f9b676-2fzvp\" (UID: \"e227d829-9a02-40dd-b0c5-012a7d024253\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.537220 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959rz\" (UniqueName: 
\"kubernetes.io/projected/cb68cf9f-4ba2-410a-85f7-1db627311ff6-kube-api-access-959rz\") pod \"watcher-operator-controller-manager-6cbc6dd547-llj5f\" (UID: \"cb68cf9f-4ba2-410a-85f7-1db627311ff6\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.537730 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ln4\" (UniqueName: \"kubernetes.io/projected/60dfec70-f10c-4d73-9933-f2cb76124090-kube-api-access-q4ln4\") pod \"telemetry-operator-controller-manager-5d4d74dd89-hfm8z\" (UID: \"60dfec70-f10c-4d73-9933-f2cb76124090\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.537857 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl8v9\" (UniqueName: \"kubernetes.io/projected/f87750ff-5d28-4658-b7d4-bc49bcb35886-kube-api-access-nl8v9\") pod \"test-operator-controller-manager-5cd5cb47d7-mhxlg\" (UID: \"f87750ff-5d28-4658-b7d4-bc49bcb35886\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.560746 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8v9\" (UniqueName: \"kubernetes.io/projected/f87750ff-5d28-4658-b7d4-bc49bcb35886-kube-api-access-nl8v9\") pod \"test-operator-controller-manager-5cd5cb47d7-mhxlg\" (UID: \"f87750ff-5d28-4658-b7d4-bc49bcb35886\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.595156 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ln4\" (UniqueName: \"kubernetes.io/projected/60dfec70-f10c-4d73-9933-f2cb76124090-kube-api-access-q4ln4\") pod \"telemetry-operator-controller-manager-5d4d74dd89-hfm8z\" (UID: 
\"60dfec70-f10c-4d73-9933-f2cb76124090\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.596562 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.601841 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.610308 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-gngcq" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.610518 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.629724 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.642951 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959rz\" (UniqueName: \"kubernetes.io/projected/cb68cf9f-4ba2-410a-85f7-1db627311ff6-kube-api-access-959rz\") pod \"watcher-operator-controller-manager-6cbc6dd547-llj5f\" (UID: \"cb68cf9f-4ba2-410a-85f7-1db627311ff6\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.649452 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.682093 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.687083 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959rz\" (UniqueName: \"kubernetes.io/projected/cb68cf9f-4ba2-410a-85f7-1db627311ff6-kube-api-access-959rz\") pod \"watcher-operator-controller-manager-6cbc6dd547-llj5f\" (UID: \"cb68cf9f-4ba2-410a-85f7-1db627311ff6\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.717116 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.720684 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.744903 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9169e6bf-53d3-420e-bb99-b9d897653612-cert\") pod \"openstack-operator-controller-manager-8fff4c848-5cvwf\" (UID: \"9169e6bf-53d3-420e-bb99-b9d897653612\") " pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.744959 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b77fr\" (UniqueName: \"kubernetes.io/projected/9169e6bf-53d3-420e-bb99-b9d897653612-kube-api-access-b77fr\") pod \"openstack-operator-controller-manager-8fff4c848-5cvwf\" (UID: \"9169e6bf-53d3-420e-bb99-b9d897653612\") " pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.776159 4574 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.819533 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.851448 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b77fr\" (UniqueName: \"kubernetes.io/projected/9169e6bf-53d3-420e-bb99-b9d897653612-kube-api-access-b77fr\") pod \"openstack-operator-controller-manager-8fff4c848-5cvwf\" (UID: \"9169e6bf-53d3-420e-bb99-b9d897653612\") " pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.854956 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b7b141-c133-4487-9ecb-fab0b12d82bb-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cz7492\" (UID: \"f0b7b141-c133-4487-9ecb-fab0b12d82bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.855182 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9169e6bf-53d3-420e-bb99-b9d897653612-cert\") pod \"openstack-operator-controller-manager-8fff4c848-5cvwf\" (UID: \"9169e6bf-53d3-420e-bb99-b9d897653612\") " pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:32 crc kubenswrapper[4574]: E1004 05:01:32.855390 4574 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 04 05:01:32 crc kubenswrapper[4574]: E1004 05:01:32.855464 4574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9169e6bf-53d3-420e-bb99-b9d897653612-cert podName:9169e6bf-53d3-420e-bb99-b9d897653612 nodeName:}" failed. No retries permitted until 2025-10-04 05:01:33.355445075 +0000 UTC m=+919.209588117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9169e6bf-53d3-420e-bb99-b9d897653612-cert") pod "openstack-operator-controller-manager-8fff4c848-5cvwf" (UID: "9169e6bf-53d3-420e-bb99-b9d897653612") : secret "webhook-server-cert" not found Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.873643 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.874704 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2"] Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.874813 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.890617 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b7b141-c133-4487-9ecb-fab0b12d82bb-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cz7492\" (UID: \"f0b7b141-c133-4487-9ecb-fab0b12d82bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.895366 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zb29l" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.896226 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b77fr\" (UniqueName: \"kubernetes.io/projected/9169e6bf-53d3-420e-bb99-b9d897653612-kube-api-access-b77fr\") pod \"openstack-operator-controller-manager-8fff4c848-5cvwf\" (UID: \"9169e6bf-53d3-420e-bb99-b9d897653612\") " pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.897433 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:32 crc kubenswrapper[4574]: I1004 05:01:32.967508 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjm4\" (UniqueName: \"kubernetes.io/projected/a95cec28-a993-4f56-b540-18ad84c5bd2d-kube-api-access-nrjm4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2\" (UID: \"a95cec28-a993-4f56-b540-18ad84c5bd2d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.069774 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjm4\" (UniqueName: \"kubernetes.io/projected/a95cec28-a993-4f56-b540-18ad84c5bd2d-kube-api-access-nrjm4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2\" (UID: \"a95cec28-a993-4f56-b540-18ad84c5bd2d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.096279 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjm4\" (UniqueName: \"kubernetes.io/projected/a95cec28-a993-4f56-b540-18ad84c5bd2d-kube-api-access-nrjm4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2\" (UID: \"a95cec28-a993-4f56-b540-18ad84c5bd2d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.175768 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.211850 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.250056 4574 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.325439 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.343924 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j"] Oct 04 05:01:33 crc kubenswrapper[4574]: W1004 05:01:33.363560 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f472c8_8d6c_46f0_bed2_ff2b19f3fcf7.slice/crio-f293c0f8e939ef189dbeff67b982507d0ae4ebec41868910fc6fb0774e9aa56c WatchSource:0}: Error finding container f293c0f8e939ef189dbeff67b982507d0ae4ebec41868910fc6fb0774e9aa56c: Status 404 returned error can't find the container with id f293c0f8e939ef189dbeff67b982507d0ae4ebec41868910fc6fb0774e9aa56c Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.399901 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.399935 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9169e6bf-53d3-420e-bb99-b9d897653612-cert\") pod \"openstack-operator-controller-manager-8fff4c848-5cvwf\" (UID: \"9169e6bf-53d3-420e-bb99-b9d897653612\") " pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.400061 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: 
\"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.422353 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e288039e-c6d3-4911-b284-1eb1cd2bccf2-cert\") pod \"infra-operator-controller-manager-658588b8c9-gnpjd\" (UID: \"e288039e-c6d3-4911-b284-1eb1cd2bccf2\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.427354 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9169e6bf-53d3-420e-bb99-b9d897653612-cert\") pod \"openstack-operator-controller-manager-8fff4c848-5cvwf\" (UID: \"9169e6bf-53d3-420e-bb99-b9d897653612\") " pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.427887 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.659613 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.757649 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" event={"ID":"39766d86-7ab2-42ca-b6ae-0e02eb871cc3","Type":"ContainerStarted","Data":"b3a90455b6fee444a2e9823021410e5f6a08e763ecffe8de4d27165a668d1da5"} Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.757786 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.758660 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" event={"ID":"4552356b-ed71-465f-beb5-26c4a63dc81d","Type":"ContainerStarted","Data":"541fa7bda66d45a0fd4bcabc22b866ad292d8ecc443f51a0567f9f23fccdbaec"} Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.759540 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" event={"ID":"9c976366-a9b2-4720-a5ce-2aeffaf0dad2","Type":"ContainerStarted","Data":"6f056e6b398d9a4d8f07754fd0715b12e7807f5babb7e94b2940eee309e97f47"} Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.761339 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" event={"ID":"d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7","Type":"ContainerStarted","Data":"f293c0f8e939ef189dbeff67b982507d0ae4ebec41868910fc6fb0774e9aa56c"} Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.763749 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" 
event={"ID":"52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd","Type":"ContainerStarted","Data":"0bb6978aa065db8255bddc42959617aa79f5b3b691602e60c61bbb00e24b15d9"} Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.811641 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.827158 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.840279 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.855483 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd"] Oct 04 05:01:33 crc kubenswrapper[4574]: W1004 05:01:33.862879 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1edbf723_752f_416b_a922_12a73521d6f9.slice/crio-abd3b36629e71b08c04c0dfc0ba365cfae9f877dccfe0a8ce3d3dad1ffbe3398 WatchSource:0}: Error finding container abd3b36629e71b08c04c0dfc0ba365cfae9f877dccfe0a8ce3d3dad1ffbe3398: Status 404 returned error can't find the container with id abd3b36629e71b08c04c0dfc0ba365cfae9f877dccfe0a8ce3d3dad1ffbe3398 Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.871147 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk"] Oct 04 05:01:33 crc kubenswrapper[4574]: I1004 05:01:33.879834 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j"] Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.080298 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd"] Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.117044 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z"] Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.142281 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz"] Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.157077 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz"] Oct 04 05:01:34 crc kubenswrapper[4574]: W1004 05:01:34.167917 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c14b8b_0e39_40a8_8f1c_9eefffe0f3a4.slice/crio-5a6ef84483ef4940db42f1c5f8144112287353ae05fb5771618282d8af1df4fb WatchSource:0}: Error finding container 5a6ef84483ef4940db42f1c5f8144112287353ae05fb5771618282d8af1df4fb: Status 404 returned error can't find the container with id 5a6ef84483ef4940db42f1c5f8144112287353ae05fb5771618282d8af1df4fb Oct 04 05:01:34 crc kubenswrapper[4574]: W1004 05:01:34.191610 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54443166_57a5_4e11_914c_d9cb2f3252d7.slice/crio-629a89277c77dcafe4bcdaf416a1e1749684080f5d87a8332ec406f9e5046f1b WatchSource:0}: Error finding container 629a89277c77dcafe4bcdaf416a1e1749684080f5d87a8332ec406f9e5046f1b: Status 404 returned error can't find the container with id 629a89277c77dcafe4bcdaf416a1e1749684080f5d87a8332ec406f9e5046f1b Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.199367 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-szqdm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7468f855d8-g2kpz_openstack-operators(54443166-57a5-4e11-914c-d9cb2f3252d7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 05:01:34 crc kubenswrapper[4574]: W1004 05:01:34.199995 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60dfec70_f10c_4d73_9933_f2cb76124090.slice/crio-efbe6c97cf5ee024d0d6aa4fe3297d584bbf3cf3ae705b693797c13512575399 WatchSource:0}: Error finding container efbe6c97cf5ee024d0d6aa4fe3297d584bbf3cf3ae705b693797c13512575399: Status 404 returned error can't find the container with id efbe6c97cf5ee024d0d6aa4fe3297d584bbf3cf3ae705b693797c13512575399 Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.377151 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg"] Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.390988 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp"] Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.430194 4574 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dsj5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-2fzvp_openstack-operators(e227d829-9a02-40dd-b0c5-012a7d024253): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.432661 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nl8v9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-mhxlg_openstack-operators(f87750ff-5d28-4658-b7d4-bc49bcb35886): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.447698 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f"] Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.453044 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrjm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2_openstack-operators(a95cec28-a993-4f56-b540-18ad84c5bd2d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 
05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.454202 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" podUID="a95cec28-a993-4f56-b540-18ad84c5bd2d" Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.457015 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:ni
l,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack
-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelop
e-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-ce
ntos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils
:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-op
erators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-
antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5tcfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5dfbbd665cz7492_openstack-operators(f0b7b141-c133-4487-9ecb-fab0b12d82bb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.467441 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2"] Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.474177 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492"] Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.484459 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd"] Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.485757 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" podUID="54443166-57a5-4e11-914c-d9cb2f3252d7" Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.502945 4574 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf"] Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.503779 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wm5rh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-gnpjd_openstack-operators(e288039e-c6d3-4911-b284-1eb1cd2bccf2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.747623 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" podUID="f87750ff-5d28-4658-b7d4-bc49bcb35886" Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.780587 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" event={"ID":"e288039e-c6d3-4911-b284-1eb1cd2bccf2","Type":"ContainerStarted","Data":"b802d37574772a3b242390b6330ad19bdb1953a56f5a2603b6160cfcd65c939d"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.796586 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" 
event={"ID":"90b04996-9e73-45c9-a03c-59e4bedf4ff4","Type":"ContainerStarted","Data":"8b4bcfab9c1b6d55e3aeffe36f4d4acfb8c988a0747fa83f62c3c602d04c4d9f"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.801590 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" event={"ID":"f0b7b141-c133-4487-9ecb-fab0b12d82bb","Type":"ContainerStarted","Data":"859f39418e6557eed278e6a6bd12ed84e9c0873a10e262f4d219a53836ff0c14"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.814163 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" event={"ID":"60dfec70-f10c-4d73-9933-f2cb76124090","Type":"ContainerStarted","Data":"efbe6c97cf5ee024d0d6aa4fe3297d584bbf3cf3ae705b693797c13512575399"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.816484 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" event={"ID":"d552b4e4-9120-4d96-8615-fa6d68a71042","Type":"ContainerStarted","Data":"b66de9ec6adde6068f93e6bd351e7b7672b788488766afe3cb83259b3c57e4fb"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.831507 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" event={"ID":"54443166-57a5-4e11-914c-d9cb2f3252d7","Type":"ContainerStarted","Data":"7d388aea61903e35a5e96c014f7e65516496bcfbf483690a85b0ebcc33877df6"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.831553 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" event={"ID":"54443166-57a5-4e11-914c-d9cb2f3252d7","Type":"ContainerStarted","Data":"629a89277c77dcafe4bcdaf416a1e1749684080f5d87a8332ec406f9e5046f1b"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.834695 4574 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" event={"ID":"a95cec28-a993-4f56-b540-18ad84c5bd2d","Type":"ContainerStarted","Data":"4ceb36668c61613d8311a56e4ed084a0449bbb83cdc3ab6e65fb3d090f4d0402"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.836448 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" event={"ID":"28570522-1dff-475f-8ab0-963f4ac14534","Type":"ContainerStarted","Data":"6b24dc2fa4725aa94e4723ce75e9780f8ed22d14b63a2056e126d1aec43ca486"} Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.837091 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" podUID="a95cec28-a993-4f56-b540-18ad84c5bd2d" Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.837524 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" podUID="54443166-57a5-4e11-914c-d9cb2f3252d7" Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.852122 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" event={"ID":"e227d829-9a02-40dd-b0c5-012a7d024253","Type":"ContainerStarted","Data":"1f600e3c2de6888d2b687b92849713bb14742c55ef3a649acec0550f972077d4"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.865429 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" event={"ID":"d4f548d4-c2a0-4756-a55a-3d398b81d923","Type":"ContainerStarted","Data":"40952b53e65dafdb1c64c6fea49ecb4b576725f2faab0f96fa4e4f1344098e66"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.869587 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" event={"ID":"55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4","Type":"ContainerStarted","Data":"5a6ef84483ef4940db42f1c5f8144112287353ae05fb5771618282d8af1df4fb"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.878175 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" event={"ID":"46bd489f-f708-4c7e-b697-39e9fd65a30e","Type":"ContainerStarted","Data":"a3eab10e4d7aaa6944aac71222d231fce2ff43f63bdb0cbc36cee44b0c267f8a"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.883663 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" event={"ID":"f87750ff-5d28-4658-b7d4-bc49bcb35886","Type":"ContainerStarted","Data":"ceb7adba719ebe4feb524908f83c32737ba1049597ef58d645303356c89bdb2c"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.883979 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" event={"ID":"f87750ff-5d28-4658-b7d4-bc49bcb35886","Type":"ContainerStarted","Data":"e895d42138bcd367cb80816b68933ac239ef62509f53904ef57432827d2c63bc"} Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.888031 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" podUID="f87750ff-5d28-4658-b7d4-bc49bcb35886" Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.913920 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" event={"ID":"95f9af94-f839-464f-8c6f-8928146b0d26","Type":"ContainerStarted","Data":"64b0fa6b099a02b21f9aacf357fcdedbc68c1f97bb8b570053919548f1591dc4"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.934151 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" event={"ID":"cb68cf9f-4ba2-410a-85f7-1db627311ff6","Type":"ContainerStarted","Data":"769fd9abb1eb36a2990c3bfdfd11e7b44f9167d5928f53320a3507068df9aa68"} Oct 04 05:01:34 crc kubenswrapper[4574]: E1004 05:01:34.951152 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" podUID="e227d829-9a02-40dd-b0c5-012a7d024253" Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.956342 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" event={"ID":"85b1921d-1572-4aff-b002-2f31c2f270b4","Type":"ContainerStarted","Data":"2f53ef974d06d89d279c81038cb0e095e881bec2ab4ca7338c102c0385744e79"} Oct 04 05:01:34 crc kubenswrapper[4574]: I1004 05:01:34.966442 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" event={"ID":"1edbf723-752f-416b-a922-12a73521d6f9","Type":"ContainerStarted","Data":"abd3b36629e71b08c04c0dfc0ba365cfae9f877dccfe0a8ce3d3dad1ffbe3398"} Oct 04 05:01:34 crc 
kubenswrapper[4574]: I1004 05:01:34.992482 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" event={"ID":"9169e6bf-53d3-420e-bb99-b9d897653612","Type":"ContainerStarted","Data":"cf7619e85384d580fa1905149cff0f277a9b4957f53545104b048f0ae8392312"} Oct 04 05:01:35 crc kubenswrapper[4574]: E1004 05:01:35.082153 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" podUID="e288039e-c6d3-4911-b284-1eb1cd2bccf2" Oct 04 05:01:35 crc kubenswrapper[4574]: E1004 05:01:35.158057 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" podUID="f0b7b141-c133-4487-9ecb-fab0b12d82bb" Oct 04 05:01:36 crc kubenswrapper[4574]: I1004 05:01:36.051025 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" event={"ID":"e288039e-c6d3-4911-b284-1eb1cd2bccf2","Type":"ContainerStarted","Data":"5b81e8053b276272b0a8296a3ce1c85660533ec323c376af9da63bab03e02289"} Oct 04 05:01:36 crc kubenswrapper[4574]: E1004 05:01:36.053536 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" podUID="e288039e-c6d3-4911-b284-1eb1cd2bccf2" Oct 04 05:01:36 crc kubenswrapper[4574]: I1004 05:01:36.058480 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" event={"ID":"f0b7b141-c133-4487-9ecb-fab0b12d82bb","Type":"ContainerStarted","Data":"3d64f3db84c9409ead87bb0608d597df2232c104ff58259dbe2566ff03aa12ef"} Oct 04 05:01:36 crc kubenswrapper[4574]: E1004 05:01:36.062558 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" podUID="f0b7b141-c133-4487-9ecb-fab0b12d82bb" Oct 04 05:01:36 crc kubenswrapper[4574]: I1004 05:01:36.068022 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" event={"ID":"9169e6bf-53d3-420e-bb99-b9d897653612","Type":"ContainerStarted","Data":"5d17ee4cb6f8260f72aa95fb2ae54f43916e83e98f314e43846cbf296a034b73"} Oct 04 05:01:36 crc kubenswrapper[4574]: I1004 05:01:36.068071 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" event={"ID":"9169e6bf-53d3-420e-bb99-b9d897653612","Type":"ContainerStarted","Data":"47855849bdae996c7693bbc6ebe36f6ec1a230800a97908e24ee9493ae127dda"} Oct 04 05:01:36 crc kubenswrapper[4574]: I1004 05:01:36.068830 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:36 crc kubenswrapper[4574]: I1004 05:01:36.074961 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" event={"ID":"e227d829-9a02-40dd-b0c5-012a7d024253","Type":"ContainerStarted","Data":"0522d20b8c42390f15c5df199ecdaf384f0baa3f404e9ace4559366a68d4e239"} 
Oct 04 05:01:36 crc kubenswrapper[4574]: E1004 05:01:36.081879 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" podUID="54443166-57a5-4e11-914c-d9cb2f3252d7" Oct 04 05:01:36 crc kubenswrapper[4574]: E1004 05:01:36.091535 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" podUID="a95cec28-a993-4f56-b540-18ad84c5bd2d" Oct 04 05:01:36 crc kubenswrapper[4574]: E1004 05:01:36.091634 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" podUID="e227d829-9a02-40dd-b0c5-012a7d024253" Oct 04 05:01:36 crc kubenswrapper[4574]: E1004 05:01:36.099482 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" podUID="f87750ff-5d28-4658-b7d4-bc49bcb35886" Oct 04 05:01:36 crc kubenswrapper[4574]: I1004 05:01:36.146706 4574 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" podStartSLOduration=4.146676888 podStartE2EDuration="4.146676888s" podCreationTimestamp="2025-10-04 05:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:01:36.140332104 +0000 UTC m=+921.994475146" watchObservedRunningTime="2025-10-04 05:01:36.146676888 +0000 UTC m=+922.000819930" Oct 04 05:01:37 crc kubenswrapper[4574]: E1004 05:01:37.101150 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" podUID="f0b7b141-c133-4487-9ecb-fab0b12d82bb" Oct 04 05:01:37 crc kubenswrapper[4574]: E1004 05:01:37.101438 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" podUID="e288039e-c6d3-4911-b284-1eb1cd2bccf2" Oct 04 05:01:37 crc kubenswrapper[4574]: E1004 05:01:37.101487 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" podUID="e227d829-9a02-40dd-b0c5-012a7d024253" Oct 04 05:01:43 crc kubenswrapper[4574]: I1004 05:01:43.665491 
4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-8fff4c848-5cvwf" Oct 04 05:01:47 crc kubenswrapper[4574]: E1004 05:01:47.168183 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:585796b996a5b6d7ad68f0cb420bf4f2ee38c9f16f194e3111c162ce91ea8a7b" Oct 04 05:01:47 crc kubenswrapper[4574]: E1004 05:01:47.168889 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:585796b996a5b6d7ad68f0cb420bf4f2ee38c9f16f194e3111c162ce91ea8a7b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wn9fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-75dfd9b554-qbzx8_openstack-operators(39766d86-7ab2-42ca-b6ae-0e02eb871cc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:47 crc kubenswrapper[4574]: E1004 05:01:47.687787 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830" Oct 04 05:01:47 crc kubenswrapper[4574]: E1004 05:01:47.688058 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z22d4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cinder-operator-controller-manager-7d4d4f8d-9t5xx_openstack-operators(4552356b-ed71-465f-beb5-26c4a63dc81d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:48 crc kubenswrapper[4574]: E1004 05:01:48.073767 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1" Oct 04 05:01:48 crc kubenswrapper[4574]: E1004 05:01:48.073965 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6svtn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-b4fbd_openstack-operators(28570522-1dff-475f-8ab0-963f4ac14534): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:49 crc kubenswrapper[4574]: I1004 05:01:49.404912 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:01:49 crc kubenswrapper[4574]: I1004 05:01:49.404979 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:01:49 crc kubenswrapper[4574]: E1004 05:01:49.488107 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe" Oct 04 05:01:49 crc kubenswrapper[4574]: E1004 05:01:49.488413 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mj6v2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j_openstack-operators(85b1921d-1572-4aff-b002-2f31c2f270b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:50 crc kubenswrapper[4574]: E1004 05:01:50.019908 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac" Oct 04 05:01:50 crc kubenswrapper[4574]: E1004 05:01:50.020301 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-plsb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ironic-operator-controller-manager-699b87f775-xcjwv_openstack-operators(d4f548d4-c2a0-4756-a55a-3d398b81d923): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:50 crc kubenswrapper[4574]: E1004 05:01:50.587678 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:0cefa320e45c741f8bffea583eeb6cf7465c4e0a183ae51614bf4b7677ffcb55" Oct 04 05:01:50 crc kubenswrapper[4574]: E1004 05:01:50.587949 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:0cefa320e45c741f8bffea583eeb6cf7465c4e0a183ae51614bf4b7677ffcb55,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fl6vx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-8f58bc9db-mdh2j_openstack-operators(d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:50 crc kubenswrapper[4574]: E1004 05:01:50.959276 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6" Oct 04 05:01:50 crc kubenswrapper[4574]: E1004 05:01:50.959464 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-959rz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-operator-controller-manager-6cbc6dd547-llj5f_openstack-operators(cb68cf9f-4ba2-410a-85f7-1db627311ff6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:52 crc kubenswrapper[4574]: E1004 05:01:52.126228 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e" Oct 04 05:01:52 crc kubenswrapper[4574]: E1004 05:01:52.126715 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q4ln4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-hfm8z_openstack-operators(60dfec70-f10c-4d73-9933-f2cb76124090): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.107850 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" podUID="28570522-1dff-475f-8ab0-963f4ac14534" Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.173771 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" podUID="4552356b-ed71-465f-beb5-26c4a63dc81d" Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.194313 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" podUID="d4f548d4-c2a0-4756-a55a-3d398b81d923" Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.261591 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" podUID="39766d86-7ab2-42ca-b6ae-0e02eb871cc3" Oct 04 05:01:57 crc kubenswrapper[4574]: I1004 05:01:57.261811 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" event={"ID":"1edbf723-752f-416b-a922-12a73521d6f9","Type":"ContainerStarted","Data":"ba775cb932739c774e4f650258595da15c235ff1beb2dbef80f53041a87cb340"} Oct 04 05:01:57 crc kubenswrapper[4574]: I1004 05:01:57.265610 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" event={"ID":"28570522-1dff-475f-8ab0-963f4ac14534","Type":"ContainerStarted","Data":"8a2e3db04c5b73c13bbf0c9a344fa1c58cbe2eefbee13832cce11424da552b88"} Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.266813 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" 
pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" podUID="28570522-1dff-475f-8ab0-963f4ac14534" Oct 04 05:01:57 crc kubenswrapper[4574]: I1004 05:01:57.268418 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" event={"ID":"d4f548d4-c2a0-4756-a55a-3d398b81d923","Type":"ContainerStarted","Data":"3d16af1db09d1d3eccc35d97466cbff09497e318b71d291e1d690d9abfd84ef2"} Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.276896 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" podUID="d4f548d4-c2a0-4756-a55a-3d398b81d923" Oct 04 05:01:57 crc kubenswrapper[4574]: I1004 05:01:57.298674 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" event={"ID":"90b04996-9e73-45c9-a03c-59e4bedf4ff4","Type":"ContainerStarted","Data":"353b9a131cf62c4f60bea75754ab82e9495133784ff64c5615551fc597549200"} Oct 04 05:01:57 crc kubenswrapper[4574]: I1004 05:01:57.311075 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" event={"ID":"4552356b-ed71-465f-beb5-26c4a63dc81d","Type":"ContainerStarted","Data":"345c6054458a598dc5592cdefd06840dafe4a5abe891bbc04f8ab0bc6566721e"} Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.312404 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830\\\"\"" 
pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" podUID="4552356b-ed71-465f-beb5-26c4a63dc81d" Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.315644 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" podUID="60dfec70-f10c-4d73-9933-f2cb76124090" Oct 04 05:01:57 crc kubenswrapper[4574]: I1004 05:01:57.335407 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" event={"ID":"52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd","Type":"ContainerStarted","Data":"77f18ac71c9812fffdc75c5e588299a8534ac8f74391b6c8c96690e42fd4e41d"} Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.410267 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" podUID="85b1921d-1572-4aff-b002-2f31c2f270b4" Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.469166 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" podUID="cb68cf9f-4ba2-410a-85f7-1db627311ff6" Oct 04 05:01:57 crc kubenswrapper[4574]: E1004 05:01:57.484587 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" 
podUID="d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.351377 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" event={"ID":"f0b7b141-c133-4487-9ecb-fab0b12d82bb","Type":"ContainerStarted","Data":"a15407a51ef17488a12df722b36b59b4fed7e04584669bcb650dd1f8a2238984"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.352438 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.355084 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" event={"ID":"d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7","Type":"ContainerStarted","Data":"3eec9ec53fbafaf741c634e638fa82bf5d102a2b0fcfb6725a95c0da665e108e"} Oct 04 05:01:58 crc kubenswrapper[4574]: E1004 05:01:58.356500 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:0cefa320e45c741f8bffea583eeb6cf7465c4e0a183ae51614bf4b7677ffcb55\\\"\"" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" podUID="d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.364538 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" event={"ID":"52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd","Type":"ContainerStarted","Data":"4571b2cbfa65a5653efd3a6384bfc564b8703d5dc15132399d8478204f546710"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.364615 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" Oct 
04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.370849 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" event={"ID":"60dfec70-f10c-4d73-9933-f2cb76124090","Type":"ContainerStarted","Data":"9752637e4622f09d333be337d0aab90bb0b4c497e555006853d69e2376111762"} Oct 04 05:01:58 crc kubenswrapper[4574]: E1004 05:01:58.373034 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" podUID="60dfec70-f10c-4d73-9933-f2cb76124090" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.381450 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" event={"ID":"cb68cf9f-4ba2-410a-85f7-1db627311ff6","Type":"ContainerStarted","Data":"b3ff5d5e0040aed73d6b7c24c5bc82b31077583e4dda97cd2ae62220934e3f62"} Oct 04 05:01:58 crc kubenswrapper[4574]: E1004 05:01:58.382543 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" podUID="cb68cf9f-4ba2-410a-85f7-1db627311ff6" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.385576 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" event={"ID":"d552b4e4-9120-4d96-8615-fa6d68a71042","Type":"ContainerStarted","Data":"c6c67f64c8d520031bd169a818dcfa6338a438febff629a632b5b826afcba15d"} 
Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.385615 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" event={"ID":"d552b4e4-9120-4d96-8615-fa6d68a71042","Type":"ContainerStarted","Data":"be5a77cc0fbf80cb1d641603e962b35b094a83cc5d316b12dd093fb8a32d6632"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.385728 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.390506 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" event={"ID":"a95cec28-a993-4f56-b540-18ad84c5bd2d","Type":"ContainerStarted","Data":"24d13e0ff344031a43320cc05d87f875ee7b56aa61710e9c99f51d2693b7dec7"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.401396 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" event={"ID":"54443166-57a5-4e11-914c-d9cb2f3252d7","Type":"ContainerStarted","Data":"70f53396a57e71c19e6c328f466f0b09c85883a11dd6fdb5c2bf2b738eb2e1f6"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.401772 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.404292 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" event={"ID":"39766d86-7ab2-42ca-b6ae-0e02eb871cc3","Type":"ContainerStarted","Data":"ab9650ee47eef7e3dddb1fd8eb02547cae6430ae2dcf650415065ae413190c56"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.406167 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" event={"ID":"90b04996-9e73-45c9-a03c-59e4bedf4ff4","Type":"ContainerStarted","Data":"7ab86354f37942a6d918ac8e730411871bcb852ab95a6cb6159a938ff4c92fa7"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.406844 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.408462 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" event={"ID":"55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4","Type":"ContainerStarted","Data":"06586da087891bfe3df76a824b3b7d1fc8e60d474963604c9fe19326c3547147"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.412078 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" event={"ID":"95f9af94-f839-464f-8c6f-8928146b0d26","Type":"ContainerStarted","Data":"f6609b139280e5f6bc94b4f7c18185da62c80b9d0f3d0d3d2b8f7ebf7d2b5f5e"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.412123 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" event={"ID":"95f9af94-f839-464f-8c6f-8928146b0d26","Type":"ContainerStarted","Data":"c7a01a542cf66a4f1c5a32aac493576db0326ff89c32b606743f6537065740d0"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.412425 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.415835 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" 
event={"ID":"e227d829-9a02-40dd-b0c5-012a7d024253","Type":"ContainerStarted","Data":"2ac72821420f28e1199dfc8b5f7e07e185467b2fb8e9c68d24047cf16b091f3d"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.416191 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.418508 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" event={"ID":"e288039e-c6d3-4911-b284-1eb1cd2bccf2","Type":"ContainerStarted","Data":"1530d8299da483e0bb2c510e3f0b0e54e5e4b97df7ff30fd67ef66c4b5cf84f1"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.419000 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.420437 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" event={"ID":"46bd489f-f708-4c7e-b697-39e9fd65a30e","Type":"ContainerStarted","Data":"cf282305bf8aadf83e433cd015048e7ed4655705ed8afbb066ee23b3cf0cf76d"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.422333 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" event={"ID":"f87750ff-5d28-4658-b7d4-bc49bcb35886","Type":"ContainerStarted","Data":"a46f72d41d9e6d87cfb8fd945bd6415b715dec7e8c95e61b483c70554f4eb697"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.422506 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.427834 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" event={"ID":"85b1921d-1572-4aff-b002-2f31c2f270b4","Type":"ContainerStarted","Data":"38cf2a8721b2b8befe9cdeee3d57345c297619520012a0922e9bbd42ad36a46f"} Oct 04 05:01:58 crc kubenswrapper[4574]: E1004 05:01:58.433150 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" podUID="85b1921d-1572-4aff-b002-2f31c2f270b4" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.433759 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" podStartSLOduration=5.187768342 podStartE2EDuration="27.433742876s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.450842608 +0000 UTC m=+920.304985650" lastFinishedPulling="2025-10-04 05:01:56.696817142 +0000 UTC m=+942.550960184" observedRunningTime="2025-10-04 05:01:58.433271933 +0000 UTC m=+944.287414975" watchObservedRunningTime="2025-10-04 05:01:58.433742876 +0000 UTC m=+944.287885918" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.448202 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" event={"ID":"9c976366-a9b2-4720-a5ce-2aeffaf0dad2","Type":"ContainerStarted","Data":"e767db350e52dac33436241289f114d29f98ce74e62063bff9000d4f5f26d63e"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.448298 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" 
event={"ID":"9c976366-a9b2-4720-a5ce-2aeffaf0dad2","Type":"ContainerStarted","Data":"5dfafa58bf9493765220dcd09517b4fc1105d81efc9b511ab35050db5c2c2000"} Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.448704 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" Oct 04 05:01:58 crc kubenswrapper[4574]: E1004 05:01:58.461486 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:c438734cc669f60ba9d4692fab478cbd326c7de2539d482a21de54a1384ad7ac\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" podUID="d4f548d4-c2a0-4756-a55a-3d398b81d923" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.478998 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" podStartSLOduration=6.298900088 podStartE2EDuration="27.478984771s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.432548267 +0000 UTC m=+920.286691319" lastFinishedPulling="2025-10-04 05:01:55.61263296 +0000 UTC m=+941.466776002" observedRunningTime="2025-10-04 05:01:58.475709085 +0000 UTC m=+944.329852127" watchObservedRunningTime="2025-10-04 05:01:58.478984771 +0000 UTC m=+944.333127803" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.519135 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" podStartSLOduration=5.327085829 podStartE2EDuration="27.519120396s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.50358416 +0000 UTC m=+920.357727202" lastFinishedPulling="2025-10-04 05:01:56.695618727 +0000 UTC m=+942.549761769" 
observedRunningTime="2025-10-04 05:01:58.512127343 +0000 UTC m=+944.366270385" watchObservedRunningTime="2025-10-04 05:01:58.519120396 +0000 UTC m=+944.373263438" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.581602 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2" podStartSLOduration=4.348747789 podStartE2EDuration="26.581583481s" podCreationTimestamp="2025-10-04 05:01:32 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.452894678 +0000 UTC m=+920.307037720" lastFinishedPulling="2025-10-04 05:01:56.68573037 +0000 UTC m=+942.539873412" observedRunningTime="2025-10-04 05:01:58.576773731 +0000 UTC m=+944.430916773" watchObservedRunningTime="2025-10-04 05:01:58.581583481 +0000 UTC m=+944.435726523" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.636255 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" podStartSLOduration=7.546767495 podStartE2EDuration="27.636225568s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.390846037 +0000 UTC m=+919.244989079" lastFinishedPulling="2025-10-04 05:01:53.4803041 +0000 UTC m=+939.334447152" observedRunningTime="2025-10-04 05:01:58.611361176 +0000 UTC m=+944.465504218" watchObservedRunningTime="2025-10-04 05:01:58.636225568 +0000 UTC m=+944.490368610" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.655226 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" podStartSLOduration=8.011038632 podStartE2EDuration="27.6552081s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.83640451 +0000 UTC m=+919.690547542" lastFinishedPulling="2025-10-04 05:01:53.480573968 +0000 UTC m=+939.334717010" observedRunningTime="2025-10-04 
05:01:58.633612802 +0000 UTC m=+944.487755844" watchObservedRunningTime="2025-10-04 05:01:58.6552081 +0000 UTC m=+944.509351142" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.732143 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" podStartSLOduration=8.144313603 podStartE2EDuration="27.732124524s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.904781526 +0000 UTC m=+919.758924568" lastFinishedPulling="2025-10-04 05:01:53.492592447 +0000 UTC m=+939.346735489" observedRunningTime="2025-10-04 05:01:58.685750297 +0000 UTC m=+944.539893339" watchObservedRunningTime="2025-10-04 05:01:58.732124524 +0000 UTC m=+944.586267576" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.733033 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" podStartSLOduration=5.235360115 podStartE2EDuration="27.73302732s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.199058194 +0000 UTC m=+920.053201236" lastFinishedPulling="2025-10-04 05:01:56.696725399 +0000 UTC m=+942.550868441" observedRunningTime="2025-10-04 05:01:58.730872767 +0000 UTC m=+944.585015809" watchObservedRunningTime="2025-10-04 05:01:58.73302732 +0000 UTC m=+944.587170352" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.855983 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" podStartSLOduration=8.264119412 podStartE2EDuration="27.855968881s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.911607224 +0000 UTC m=+919.765750266" lastFinishedPulling="2025-10-04 05:01:53.503456693 +0000 UTC m=+939.357599735" observedRunningTime="2025-10-04 05:01:58.852304465 +0000 UTC 
m=+944.706447507" watchObservedRunningTime="2025-10-04 05:01:58.855968881 +0000 UTC m=+944.710111923" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.874669 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" podStartSLOduration=5.596209087 podStartE2EDuration="27.874650414s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.430027214 +0000 UTC m=+920.284170256" lastFinishedPulling="2025-10-04 05:01:56.708468541 +0000 UTC m=+942.562611583" observedRunningTime="2025-10-04 05:01:58.871165203 +0000 UTC m=+944.725308245" watchObservedRunningTime="2025-10-04 05:01:58.874650414 +0000 UTC m=+944.728793536" Oct 04 05:01:58 crc kubenswrapper[4574]: I1004 05:01:58.977149 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" podStartSLOduration=7.748251118 podStartE2EDuration="27.977129121s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.252143828 +0000 UTC m=+919.106286870" lastFinishedPulling="2025-10-04 05:01:53.481021831 +0000 UTC m=+939.335164873" observedRunningTime="2025-10-04 05:01:58.975189334 +0000 UTC m=+944.829332376" watchObservedRunningTime="2025-10-04 05:01:58.977129121 +0000 UTC m=+944.831272163" Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.468777 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" event={"ID":"46bd489f-f708-4c7e-b697-39e9fd65a30e","Type":"ContainerStarted","Data":"ddd1f42d67b9e3b3611faffb52713d910ad58d8d1a9f0335c3ac061d2afd30c9"} Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.468892 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" Oct 04 05:01:59 crc 
kubenswrapper[4574]: I1004 05:01:59.470410 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" event={"ID":"1edbf723-752f-416b-a922-12a73521d6f9","Type":"ContainerStarted","Data":"1d4c5307bd16bf2a8ad457ebbe81268037e3415870a1267811d89be0792d542b"} Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.471003 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.471995 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" event={"ID":"39766d86-7ab2-42ca-b6ae-0e02eb871cc3","Type":"ContainerStarted","Data":"c49a432c37028ff51404d6954fe757bf68126a43ecc62d7a31cb0b4030600be1"} Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.472260 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.474968 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" event={"ID":"55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4","Type":"ContainerStarted","Data":"264776f698dddd34a42341b19c37c8581bae90f4dcf7215c95abfd86e9524685"} Oct 04 05:01:59 crc kubenswrapper[4574]: E1004 05:01:59.475919 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" podUID="cb68cf9f-4ba2-410a-85f7-1db627311ff6" Oct 04 05:01:59 crc kubenswrapper[4574]: E1004 05:01:59.477143 4574 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:0cefa320e45c741f8bffea583eeb6cf7465c4e0a183ae51614bf4b7677ffcb55\\\"\"" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" podUID="d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7" Oct 04 05:01:59 crc kubenswrapper[4574]: E1004 05:01:59.477694 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" podUID="85b1921d-1572-4aff-b002-2f31c2f270b4" Oct 04 05:01:59 crc kubenswrapper[4574]: E1004 05:01:59.480479 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" podUID="60dfec70-f10c-4d73-9933-f2cb76124090" Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.527674 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" podStartSLOduration=9.200563975 podStartE2EDuration="28.527653872s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.174558513 +0000 UTC m=+920.028701555" lastFinishedPulling="2025-10-04 05:01:53.50164841 +0000 UTC m=+939.355791452" observedRunningTime="2025-10-04 05:01:59.49795859 +0000 UTC m=+945.352101652" watchObservedRunningTime="2025-10-04 05:01:59.527653872 +0000 UTC 
m=+945.381796914" Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.566458 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" podStartSLOduration=9.22589418 podStartE2EDuration="28.566437349s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.17000655 +0000 UTC m=+920.024149592" lastFinishedPulling="2025-10-04 05:01:53.510549719 +0000 UTC m=+939.364692761" observedRunningTime="2025-10-04 05:01:59.541812974 +0000 UTC m=+945.395956016" watchObservedRunningTime="2025-10-04 05:01:59.566437349 +0000 UTC m=+945.420580391" Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.566714 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" podStartSLOduration=4.116581517 podStartE2EDuration="28.566707217s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.413615109 +0000 UTC m=+919.267758151" lastFinishedPulling="2025-10-04 05:01:57.863740809 +0000 UTC m=+943.717883851" observedRunningTime="2025-10-04 05:01:59.559965861 +0000 UTC m=+945.414108903" watchObservedRunningTime="2025-10-04 05:01:59.566707217 +0000 UTC m=+945.420850259" Oct 04 05:01:59 crc kubenswrapper[4574]: I1004 05:01:59.639872 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" podStartSLOduration=9.012795512 podStartE2EDuration="28.639853462s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.87460292 +0000 UTC m=+919.728745962" lastFinishedPulling="2025-10-04 05:01:53.50166087 +0000 UTC m=+939.355803912" observedRunningTime="2025-10-04 05:01:59.63669173 +0000 UTC m=+945.490834772" watchObservedRunningTime="2025-10-04 05:01:59.639853462 +0000 UTC m=+945.493996504" Oct 04 
05:02:00 crc kubenswrapper[4574]: I1004 05:02:00.480026 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.150711 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c777dc986-cvjnd" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.169203 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-96hsk" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.199162 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-88mfj" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.246821 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-jt72t" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.256734 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-t222j" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.633752 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-sxfrz" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.694614 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-g2kpz" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.723391 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-2fzvp" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.778687 4574 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mhxlg" Oct 04 05:02:02 crc kubenswrapper[4574]: I1004 05:02:02.903315 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cz7492" Oct 04 05:02:03 crc kubenswrapper[4574]: I1004 05:02:03.435608 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-gnpjd" Oct 04 05:02:04 crc kubenswrapper[4574]: I1004 05:02:04.512532 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" event={"ID":"4552356b-ed71-465f-beb5-26c4a63dc81d","Type":"ContainerStarted","Data":"1340eb4ba34972b859d5a06f16b4283dfc4f5933e1bda7c43acf1eb44c500084"} Oct 04 05:02:04 crc kubenswrapper[4574]: I1004 05:02:04.512799 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" Oct 04 05:02:04 crc kubenswrapper[4574]: I1004 05:02:04.515253 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" event={"ID":"28570522-1dff-475f-8ab0-963f4ac14534","Type":"ContainerStarted","Data":"d138f2d7ad6e9e37f8e6cf55afeebfb3ea9c61487f636def7fdae05dd3b12065"} Oct 04 05:02:04 crc kubenswrapper[4574]: I1004 05:02:04.515778 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" Oct 04 05:02:04 crc kubenswrapper[4574]: I1004 05:02:04.535000 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" podStartSLOduration=2.725745545 podStartE2EDuration="33.534982333s" 
podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.23913049 +0000 UTC m=+919.093273532" lastFinishedPulling="2025-10-04 05:02:04.048367278 +0000 UTC m=+949.902510320" observedRunningTime="2025-10-04 05:02:04.53351746 +0000 UTC m=+950.387660502" watchObservedRunningTime="2025-10-04 05:02:04.534982333 +0000 UTC m=+950.389125375" Oct 04 05:02:04 crc kubenswrapper[4574]: I1004 05:02:04.552578 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" podStartSLOduration=3.366578191 podStartE2EDuration="33.552561043s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.861348485 +0000 UTC m=+919.715491527" lastFinishedPulling="2025-10-04 05:02:04.047331337 +0000 UTC m=+949.901474379" observedRunningTime="2025-10-04 05:02:04.548749153 +0000 UTC m=+950.402892195" watchObservedRunningTime="2025-10-04 05:02:04.552561043 +0000 UTC m=+950.406704085" Oct 04 05:02:09 crc kubenswrapper[4574]: I1004 05:02:09.554702 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" event={"ID":"d4f548d4-c2a0-4756-a55a-3d398b81d923","Type":"ContainerStarted","Data":"3d7fac54d9ff25c2e2a2333a6117074111e5cf4477e0257f636a7ff301076b4c"} Oct 04 05:02:09 crc kubenswrapper[4574]: I1004 05:02:09.555900 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" Oct 04 05:02:09 crc kubenswrapper[4574]: I1004 05:02:09.575000 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" podStartSLOduration=3.239159659 podStartE2EDuration="38.574981894s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.849284874 +0000 UTC 
m=+919.703427916" lastFinishedPulling="2025-10-04 05:02:09.185107109 +0000 UTC m=+955.039250151" observedRunningTime="2025-10-04 05:02:09.570904526 +0000 UTC m=+955.425047568" watchObservedRunningTime="2025-10-04 05:02:09.574981894 +0000 UTC m=+955.429124936" Oct 04 05:02:11 crc kubenswrapper[4574]: I1004 05:02:11.682381 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-mgwq7" Oct 04 05:02:11 crc kubenswrapper[4574]: I1004 05:02:11.694296 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-9t5xx" Oct 04 05:02:11 crc kubenswrapper[4574]: I1004 05:02:11.790769 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-pmvc8" Oct 04 05:02:12 crc kubenswrapper[4574]: I1004 05:02:12.028160 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-qbzx8" Oct 04 05:02:12 crc kubenswrapper[4574]: I1004 05:02:12.398692 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-b4fbd" Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.588323 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" event={"ID":"85b1921d-1572-4aff-b002-2f31c2f270b4","Type":"ContainerStarted","Data":"8cb34adf1515dfc0970958246da7443d3f8b96dd15cebf6ded57cbb0cc8e4ece"} Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.589883 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.593537 4574 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" event={"ID":"60dfec70-f10c-4d73-9933-f2cb76124090","Type":"ContainerStarted","Data":"60cf7bf828b4efa8dc289f0f78dd1469ca9bd85973dffda1b313eb7dd7567978"} Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.594249 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.602589 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" event={"ID":"d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7","Type":"ContainerStarted","Data":"92631655bc21478f2c820f001cacb4bd832da8d125c122d94898e525178a0eac"} Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.602825 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.623105 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" podStartSLOduration=3.297749211 podStartE2EDuration="42.623082241s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:33.829198291 +0000 UTC m=+919.683341333" lastFinishedPulling="2025-10-04 05:02:13.154531321 +0000 UTC m=+959.008674363" observedRunningTime="2025-10-04 05:02:13.618641102 +0000 UTC m=+959.472784164" watchObservedRunningTime="2025-10-04 05:02:13.623082241 +0000 UTC m=+959.477225283" Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.646802 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" podStartSLOduration=2.740085582 podStartE2EDuration="42.646784199s" podCreationTimestamp="2025-10-04 05:01:31 
+0000 UTC" firstStartedPulling="2025-10-04 05:01:33.371546477 +0000 UTC m=+919.225689519" lastFinishedPulling="2025-10-04 05:02:13.278245094 +0000 UTC m=+959.132388136" observedRunningTime="2025-10-04 05:02:13.643736931 +0000 UTC m=+959.497879973" watchObservedRunningTime="2025-10-04 05:02:13.646784199 +0000 UTC m=+959.500927241" Oct 04 05:02:13 crc kubenswrapper[4574]: I1004 05:02:13.663541 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" podStartSLOduration=3.581465472 podStartE2EDuration="42.663523006s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.241471876 +0000 UTC m=+920.095614918" lastFinishedPulling="2025-10-04 05:02:13.32352941 +0000 UTC m=+959.177672452" observedRunningTime="2025-10-04 05:02:13.663145615 +0000 UTC m=+959.517288657" watchObservedRunningTime="2025-10-04 05:02:13.663523006 +0000 UTC m=+959.517666058" Oct 04 05:02:14 crc kubenswrapper[4574]: I1004 05:02:14.610473 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" event={"ID":"cb68cf9f-4ba2-410a-85f7-1db627311ff6","Type":"ContainerStarted","Data":"2347c6594e19864f99a122d59d6cebad268cab023fc3ad20d223bbb61fcd5f98"} Oct 04 05:02:14 crc kubenswrapper[4574]: I1004 05:02:14.610782 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" Oct 04 05:02:14 crc kubenswrapper[4574]: I1004 05:02:14.632601 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" podStartSLOduration=4.644117238 podStartE2EDuration="43.632579443s" podCreationTimestamp="2025-10-04 05:01:31 +0000 UTC" firstStartedPulling="2025-10-04 05:01:34.450053025 +0000 UTC m=+920.304196067" lastFinishedPulling="2025-10-04 
05:02:13.43851523 +0000 UTC m=+959.292658272" observedRunningTime="2025-10-04 05:02:14.626222589 +0000 UTC m=+960.480365631" watchObservedRunningTime="2025-10-04 05:02:14.632579443 +0000 UTC m=+960.486722495" Oct 04 05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.404820 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.405401 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.405446 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.406027 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a99234efe67f037290baa95758d3a1f0d549bea91113058aaa5fd090767eb42e"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.406074 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://a99234efe67f037290baa95758d3a1f0d549bea91113058aaa5fd090767eb42e" gracePeriod=600 Oct 04 
05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.642367 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="a99234efe67f037290baa95758d3a1f0d549bea91113058aaa5fd090767eb42e" exitCode=0 Oct 04 05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.642440 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"a99234efe67f037290baa95758d3a1f0d549bea91113058aaa5fd090767eb42e"} Oct 04 05:02:19 crc kubenswrapper[4574]: I1004 05:02:19.642472 4574 scope.go:117] "RemoveContainer" containerID="6a0b072b2db63c5fef6028adb7e7cc7f770356e62fc2cc2752bf99549d02a71e" Oct 04 05:02:20 crc kubenswrapper[4574]: I1004 05:02:20.651056 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"0c021ed99dab79e0bc143879c96505d7aa34ab49c6d5b17fbf9b9b39bbe04b86"} Oct 04 05:02:21 crc kubenswrapper[4574]: I1004 05:02:21.833589 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-mdh2j" Oct 04 05:02:22 crc kubenswrapper[4574]: I1004 05:02:22.101982 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-xcjwv" Oct 04 05:02:22 crc kubenswrapper[4574]: I1004 05:02:22.179727 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j" Oct 04 05:02:22 crc kubenswrapper[4574]: I1004 05:02:22.723297 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-hfm8z" Oct 04 05:02:22 crc 
kubenswrapper[4574]: I1004 05:02:22.826994 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-llj5f" Oct 04 05:02:38 crc kubenswrapper[4574]: I1004 05:02:38.906106 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6gb9q"] Oct 04 05:02:38 crc kubenswrapper[4574]: I1004 05:02:38.907806 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:38 crc kubenswrapper[4574]: I1004 05:02:38.912328 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 04 05:02:38 crc kubenswrapper[4574]: I1004 05:02:38.912349 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 04 05:02:38 crc kubenswrapper[4574]: I1004 05:02:38.912349 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 04 05:02:38 crc kubenswrapper[4574]: I1004 05:02:38.912613 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-r7sn7" Oct 04 05:02:38 crc kubenswrapper[4574]: I1004 05:02:38.931220 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6gb9q"] Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.038259 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-config\") pod \"dnsmasq-dns-675f4bcbfc-6gb9q\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.038337 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hx5\" (UniqueName: 
\"kubernetes.io/projected/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-kube-api-access-f4hx5\") pod \"dnsmasq-dns-675f4bcbfc-6gb9q\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.050250 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppq64"] Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.052649 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.059073 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.069220 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppq64"] Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.139507 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-config\") pod \"dnsmasq-dns-675f4bcbfc-6gb9q\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.139574 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hx5\" (UniqueName: \"kubernetes.io/projected/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-kube-api-access-f4hx5\") pod \"dnsmasq-dns-675f4bcbfc-6gb9q\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.139621 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: 
\"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.139644 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr6c7\" (UniqueName: \"kubernetes.io/projected/724298ee-77dc-4d83-a3e5-24d40041670c-kube-api-access-mr6c7\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.139697 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-config\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.140676 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-config\") pod \"dnsmasq-dns-675f4bcbfc-6gb9q\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.160196 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hx5\" (UniqueName: \"kubernetes.io/projected/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-kube-api-access-f4hx5\") pod \"dnsmasq-dns-675f4bcbfc-6gb9q\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.230545 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.244624 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-config\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.244718 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.244743 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr6c7\" (UniqueName: \"kubernetes.io/projected/724298ee-77dc-4d83-a3e5-24d40041670c-kube-api-access-mr6c7\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.245788 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-config\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.245882 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 
05:02:39.293436 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr6c7\" (UniqueName: \"kubernetes.io/projected/724298ee-77dc-4d83-a3e5-24d40041670c-kube-api-access-mr6c7\") pod \"dnsmasq-dns-78dd6ddcc-ppq64\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.378756 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.794022 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppq64"] Oct 04 05:02:39 crc kubenswrapper[4574]: I1004 05:02:39.800850 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:02:40 crc kubenswrapper[4574]: I1004 05:02:40.062504 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6gb9q"] Oct 04 05:02:40 crc kubenswrapper[4574]: W1004 05:02:40.066213 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8150a80a_d8b5_481a_a50f_5b04fc35dcc3.slice/crio-b452c22e9e2bdee2e31126b6e0f2ac72cce39c22c0adf0d39cee82c916ca1a6e WatchSource:0}: Error finding container b452c22e9e2bdee2e31126b6e0f2ac72cce39c22c0adf0d39cee82c916ca1a6e: Status 404 returned error can't find the container with id b452c22e9e2bdee2e31126b6e0f2ac72cce39c22c0adf0d39cee82c916ca1a6e Oct 04 05:02:40 crc kubenswrapper[4574]: I1004 05:02:40.804883 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" event={"ID":"8150a80a-d8b5-481a-a50f-5b04fc35dcc3","Type":"ContainerStarted","Data":"b452c22e9e2bdee2e31126b6e0f2ac72cce39c22c0adf0d39cee82c916ca1a6e"} Oct 04 05:02:40 crc kubenswrapper[4574]: I1004 05:02:40.807899 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" event={"ID":"724298ee-77dc-4d83-a3e5-24d40041670c","Type":"ContainerStarted","Data":"3ae659106de4bfbd2e38a79a5ca6dfdc96c2ad2eb8f5ed276ce67c913a4ff115"} Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.069800 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6gb9q"] Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.111677 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-twlxf"] Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.115423 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.120946 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-twlxf"] Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.196776 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.196915 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99hj4\" (UniqueName: \"kubernetes.io/projected/2ab095f1-eeec-4911-bc7b-35acc57e729c-kube-api-access-99hj4\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.196970 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-config\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: 
\"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.298692 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-config\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.298773 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.298865 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99hj4\" (UniqueName: \"kubernetes.io/projected/2ab095f1-eeec-4911-bc7b-35acc57e729c-kube-api-access-99hj4\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.299975 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-config\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.301307 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc 
kubenswrapper[4574]: I1004 05:02:42.331140 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99hj4\" (UniqueName: \"kubernetes.io/projected/2ab095f1-eeec-4911-bc7b-35acc57e729c-kube-api-access-99hj4\") pod \"dnsmasq-dns-5ccc8479f9-twlxf\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.448600 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.645022 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppq64"] Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.703292 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xbt7f"] Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.704692 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.727638 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xbt7f"] Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.806969 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cltq\" (UniqueName: \"kubernetes.io/projected/077e1f13-f5ea-4812-b14a-cf42ec68bb53-kube-api-access-8cltq\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.807089 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") 
" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.807161 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-config\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.912041 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cltq\" (UniqueName: \"kubernetes.io/projected/077e1f13-f5ea-4812-b14a-cf42ec68bb53-kube-api-access-8cltq\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.912409 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.912453 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-config\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.913288 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-config\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: 
I1004 05:02:42.913939 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:42 crc kubenswrapper[4574]: I1004 05:02:42.952476 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cltq\" (UniqueName: \"kubernetes.io/projected/077e1f13-f5ea-4812-b14a-cf42ec68bb53-kube-api-access-8cltq\") pod \"dnsmasq-dns-57d769cc4f-xbt7f\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.031619 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.247280 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-twlxf"] Oct 04 05:02:43 crc kubenswrapper[4574]: W1004 05:02:43.274425 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab095f1_eeec_4911_bc7b_35acc57e729c.slice/crio-a52e844bc757afb26546c7f992831b673a698cfd43a0a7ff273a2d7068c585d5 WatchSource:0}: Error finding container a52e844bc757afb26546c7f992831b673a698cfd43a0a7ff273a2d7068c585d5: Status 404 returned error can't find the container with id a52e844bc757afb26546c7f992831b673a698cfd43a0a7ff273a2d7068c585d5 Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.355150 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.356473 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.359137 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.366826 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.366873 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.366911 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.367026 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mwpqk" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.367065 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.367351 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.373273 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521173 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521266 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521395 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e3699c-e19d-4c38-b763-32af874a1a90-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521453 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e3699c-e19d-4c38-b763-32af874a1a90-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521501 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521523 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521552 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521594 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521617 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521640 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.521668 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2vn\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-kube-api-access-8c2vn\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623080 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623136 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e3699c-e19d-4c38-b763-32af874a1a90-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623175 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e3699c-e19d-4c38-b763-32af874a1a90-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623209 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623249 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623273 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623314 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623337 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623361 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623385 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2vn\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-kube-api-access-8c2vn\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.623414 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 
05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.624658 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.627098 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.627684 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.627757 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.627682 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.629038 4574 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.635333 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.635741 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e3699c-e19d-4c38-b763-32af874a1a90-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.644932 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.646634 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e3699c-e19d-4c38-b763-32af874a1a90-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.659175 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2vn\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-kube-api-access-8c2vn\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.671315 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.679860 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xbt7f"] Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.697831 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.864965 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" event={"ID":"2ab095f1-eeec-4911-bc7b-35acc57e729c","Type":"ContainerStarted","Data":"a52e844bc757afb26546c7f992831b673a698cfd43a0a7ff273a2d7068c585d5"} Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.869102 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" event={"ID":"077e1f13-f5ea-4812-b14a-cf42ec68bb53","Type":"ContainerStarted","Data":"deecbe12a2aca0bec3f2dea8786b8c0fcd04dd6c27442113354f56dd829df4ab"} Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.889384 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.900475 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.912771 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.916174 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.918145 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.921022 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.921145 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.921665 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-s8hzq" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.921961 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 04 05:02:43 crc kubenswrapper[4574]: I1004 05:02:43.931938 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028318 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028377 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9x9\" (UniqueName: 
\"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-kube-api-access-xs9x9\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028413 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028435 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16df8292-9780-4212-a920-bf0eed95da87-pod-info\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028462 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028493 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16df8292-9780-4212-a920-bf0eed95da87-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028519 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028557 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-config-data\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028590 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-server-conf\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028734 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.028764 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.130186 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.130702 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9x9\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-kube-api-access-xs9x9\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.130745 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.130773 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16df8292-9780-4212-a920-bf0eed95da87-pod-info\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131023 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131510 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131581 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16df8292-9780-4212-a920-bf0eed95da87-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131624 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131691 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-config-data\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131747 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-server-conf\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131840 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.131883 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.132036 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.132140 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.133465 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-config-data\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.133883 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-server-conf\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.138946 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.139484 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.145798 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.149391 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16df8292-9780-4212-a920-bf0eed95da87-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.151910 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16df8292-9780-4212-a920-bf0eed95da87-pod-info\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.152693 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs9x9\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-kube-api-access-xs9x9\") pod \"rabbitmq-server-0\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.163450 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: 
\"16df8292-9780-4212-a920-bf0eed95da87\") " pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.253520 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.272449 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 05:02:44 crc kubenswrapper[4574]: W1004 05:02:44.295555 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e3699c_e19d_4c38_b763_32af874a1a90.slice/crio-7ae442e3be29bb3902342d6eb938cefa74befaf7e9e43c5bc607bf61cb8f2c98 WatchSource:0}: Error finding container 7ae442e3be29bb3902342d6eb938cefa74befaf7e9e43c5bc607bf61cb8f2c98: Status 404 returned error can't find the container with id 7ae442e3be29bb3902342d6eb938cefa74befaf7e9e43c5bc607bf61cb8f2c98 Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.862351 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:02:44 crc kubenswrapper[4574]: I1004 05:02:44.883171 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e3699c-e19d-4c38-b763-32af874a1a90","Type":"ContainerStarted","Data":"7ae442e3be29bb3902342d6eb938cefa74befaf7e9e43c5bc607bf61cb8f2c98"} Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.370435 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.401496 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.403449 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.407229 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.407440 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.407715 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.414195 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mq52n" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.416253 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.429981 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611361 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611414 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " 
pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611441 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbj7b\" (UniqueName: \"kubernetes.io/projected/f275e3ec-6c93-412b-875c-65b03a785dc0-kube-api-access-kbj7b\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611461 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f275e3ec-6c93-412b-875c-65b03a785dc0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611491 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611535 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611566 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611580 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.611602 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.638483 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.646504 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.655110 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.655432 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.655577 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5qm9z" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.655717 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.665139 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.714755 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.714808 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.714848 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.714896 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.714930 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.714958 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbj7b\" (UniqueName: \"kubernetes.io/projected/f275e3ec-6c93-412b-875c-65b03a785dc0-kube-api-access-kbj7b\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.714981 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f275e3ec-6c93-412b-875c-65b03a785dc0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.715018 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 
crc kubenswrapper[4574]: I1004 05:02:47.715074 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.716058 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.716742 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.717266 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f275e3ec-6c93-412b-875c-65b03a785dc0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.717609 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f275e3ec-6c93-412b-875c-65b03a785dc0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.717769 4574 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.740402 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.746715 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.748353 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.758829 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-px7hn" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.759038 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.759176 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.774648 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.788928 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 04 05:02:47 crc 
kubenswrapper[4574]: I1004 05:02:47.791911 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f275e3ec-6c93-412b-875c-65b03a785dc0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817142 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817210 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817286 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817326 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-secrets\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817388 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-config-data-default\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817412 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817454 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-kolla-config\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817475 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8848\" (UniqueName: \"kubernetes.io/projected/c862e2a0-256a-470f-b35b-c244555f0c5f-kube-api-access-b8848\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.817524 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c862e2a0-256a-470f-b35b-c244555f0c5f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.835151 4574 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kbj7b\" (UniqueName: \"kubernetes.io/projected/f275e3ec-6c93-412b-875c-65b03a785dc0-kube-api-access-kbj7b\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.877340 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f275e3ec-6c93-412b-875c-65b03a785dc0\") " pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.921626 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-secrets\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.921707 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-config-data\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.921739 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qndcs\" (UniqueName: \"kubernetes.io/projected/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-kube-api-access-qndcs\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.921832 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.921865 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.921953 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-kolla-config\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.922005 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8848\" (UniqueName: \"kubernetes.io/projected/c862e2a0-256a-470f-b35b-c244555f0c5f-kube-api-access-b8848\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.922040 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-kolla-config\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.922106 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c862e2a0-256a-470f-b35b-c244555f0c5f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 
crc kubenswrapper[4574]: I1004 05:02:47.922169 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.922222 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.922289 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.922368 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.922447 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.924188 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-config-data-default\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.925826 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c862e2a0-256a-470f-b35b-c244555f0c5f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.926985 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.927112 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.927169 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c862e2a0-256a-470f-b35b-c244555f0c5f-kolla-config\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.927610 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.930755 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-secrets\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.933932 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e2a0-256a-470f-b35b-c244555f0c5f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.948153 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8848\" (UniqueName: \"kubernetes.io/projected/c862e2a0-256a-470f-b35b-c244555f0c5f-kube-api-access-b8848\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.955497 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"c862e2a0-256a-470f-b35b-c244555f0c5f\") " pod="openstack/openstack-galera-0" Oct 04 05:02:47 crc kubenswrapper[4574]: I1004 05:02:47.994870 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.024694 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.024823 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-config-data\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.024854 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qndcs\" (UniqueName: \"kubernetes.io/projected/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-kube-api-access-qndcs\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.024911 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-kolla-config\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.024948 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.027038 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-config-data\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.027677 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-kolla-config\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.031096 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.031743 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.041807 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.047152 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qndcs\" (UniqueName: \"kubernetes.io/projected/58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3-kube-api-access-qndcs\") pod \"memcached-0\" (UID: \"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3\") " pod="openstack/memcached-0" Oct 04 05:02:48 crc kubenswrapper[4574]: I1004 05:02:48.194158 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 04 05:02:49 crc kubenswrapper[4574]: I1004 05:02:49.976398 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:02:49 crc kubenswrapper[4574]: I1004 05:02:49.989348 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 05:02:49 crc kubenswrapper[4574]: I1004 05:02:49.991464 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:02:49 crc kubenswrapper[4574]: I1004 05:02:49.994684 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-99fts" Oct 04 05:02:50 crc kubenswrapper[4574]: I1004 05:02:50.065770 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4qqz\" (UniqueName: \"kubernetes.io/projected/f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339-kube-api-access-f4qqz\") pod \"kube-state-metrics-0\" (UID: \"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339\") " pod="openstack/kube-state-metrics-0" Oct 04 05:02:50 crc kubenswrapper[4574]: I1004 05:02:50.169883 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4qqz\" (UniqueName: \"kubernetes.io/projected/f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339-kube-api-access-f4qqz\") pod \"kube-state-metrics-0\" (UID: \"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339\") " pod="openstack/kube-state-metrics-0" Oct 04 05:02:50 crc kubenswrapper[4574]: I1004 05:02:50.228564 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4qqz\" (UniqueName: \"kubernetes.io/projected/f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339-kube-api-access-f4qqz\") pod \"kube-state-metrics-0\" (UID: \"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339\") " pod="openstack/kube-state-metrics-0" Oct 04 05:02:50 crc kubenswrapper[4574]: I1004 05:02:50.336788 4574 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.581032 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.590324 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.594041 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.595113 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.596044 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.596260 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.596463 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.596635 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-v4gvv" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723085 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd1d5524-8818-4988-9969-45c2f2904fb4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723196 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723294 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd1d5524-8818-4988-9969-45c2f2904fb4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723326 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm4t8\" (UniqueName: \"kubernetes.io/projected/cd1d5524-8818-4988-9969-45c2f2904fb4-kube-api-access-dm4t8\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723387 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723532 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723649 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cd1d5524-8818-4988-9969-45c2f2904fb4-config\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.723759 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825010 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825092 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd1d5524-8818-4988-9969-45c2f2904fb4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825131 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825159 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd1d5524-8818-4988-9969-45c2f2904fb4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc 
kubenswrapper[4574]: I1004 05:02:52.825176 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm4t8\" (UniqueName: \"kubernetes.io/projected/cd1d5524-8818-4988-9969-45c2f2904fb4-kube-api-access-dm4t8\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825204 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825340 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825343 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825382 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1d5524-8818-4988-9969-45c2f2904fb4-config\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.825666 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd1d5524-8818-4988-9969-45c2f2904fb4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.826477 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd1d5524-8818-4988-9969-45c2f2904fb4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.826548 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1d5524-8818-4988-9969-45c2f2904fb4-config\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.830508 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.830987 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.839072 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1d5524-8818-4988-9969-45c2f2904fb4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " 
pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.861507 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.868136 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm4t8\" (UniqueName: \"kubernetes.io/projected/cd1d5524-8818-4988-9969-45c2f2904fb4-kube-api-access-dm4t8\") pod \"ovsdbserver-nb-0\" (UID: \"cd1d5524-8818-4988-9969-45c2f2904fb4\") " pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:52 crc kubenswrapper[4574]: I1004 05:02:52.916899 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.876366 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-khsmk"] Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.878649 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.883545 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.883562 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-msvrl" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.884400 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.908913 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gl29s"] Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.910583 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.943589 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-run\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.943968 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3836030c-f0c4-4392-bc54-cc817fd89934-ovn-controller-tls-certs\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944002 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-etc-ovs\") pod 
\"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944037 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12012c68-85d3-4063-90d2-b80d4d169f38-scripts\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944079 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-run-ovn\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944108 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3836030c-f0c4-4392-bc54-cc817fd89934-scripts\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944142 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-run\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944180 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2lxh\" (UniqueName: \"kubernetes.io/projected/3836030c-f0c4-4392-bc54-cc817fd89934-kube-api-access-d2lxh\") pod \"ovn-controller-khsmk\" (UID: 
\"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944215 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-lib\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944335 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3836030c-f0c4-4392-bc54-cc817fd89934-combined-ca-bundle\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944368 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-log-ovn\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944406 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcb6\" (UniqueName: \"kubernetes.io/projected/12012c68-85d3-4063-90d2-b80d4d169f38-kube-api-access-jrcb6\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.944449 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-log\") pod \"ovn-controller-ovs-gl29s\" (UID: 
\"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.945873 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gl29s"] Oct 04 05:02:53 crc kubenswrapper[4574]: I1004 05:02:53.979311 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-khsmk"] Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.045707 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-log-ovn\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.045775 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcb6\" (UniqueName: \"kubernetes.io/projected/12012c68-85d3-4063-90d2-b80d4d169f38-kube-api-access-jrcb6\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.045830 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-log\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.045877 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-run\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.045910 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3836030c-f0c4-4392-bc54-cc817fd89934-ovn-controller-tls-certs\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.045939 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-etc-ovs\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.045970 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12012c68-85d3-4063-90d2-b80d4d169f38-scripts\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.046001 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-run-ovn\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.046021 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3836030c-f0c4-4392-bc54-cc817fd89934-scripts\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.046057 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-run\") 
pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.046091 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2lxh\" (UniqueName: \"kubernetes.io/projected/3836030c-f0c4-4392-bc54-cc817fd89934-kube-api-access-d2lxh\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.046125 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-lib\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.046149 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3836030c-f0c4-4392-bc54-cc817fd89934-combined-ca-bundle\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.047145 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-log\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.047284 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-log-ovn\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc 
kubenswrapper[4574]: I1004 05:02:54.047466 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-run\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.047510 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-run-ovn\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.047516 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3836030c-f0c4-4392-bc54-cc817fd89934-var-run\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.047713 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-var-lib\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.048026 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/12012c68-85d3-4063-90d2-b80d4d169f38-etc-ovs\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.049041 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/12012c68-85d3-4063-90d2-b80d4d169f38-scripts\") pod \"ovn-controller-ovs-gl29s\" (UID: \"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.049989 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3836030c-f0c4-4392-bc54-cc817fd89934-combined-ca-bundle\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.051058 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3836030c-f0c4-4392-bc54-cc817fd89934-scripts\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.051501 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3836030c-f0c4-4392-bc54-cc817fd89934-ovn-controller-tls-certs\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.082140 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2lxh\" (UniqueName: \"kubernetes.io/projected/3836030c-f0c4-4392-bc54-cc817fd89934-kube-api-access-d2lxh\") pod \"ovn-controller-khsmk\" (UID: \"3836030c-f0c4-4392-bc54-cc817fd89934\") " pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.091093 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcb6\" (UniqueName: \"kubernetes.io/projected/12012c68-85d3-4063-90d2-b80d4d169f38-kube-api-access-jrcb6\") pod \"ovn-controller-ovs-gl29s\" (UID: 
\"12012c68-85d3-4063-90d2-b80d4d169f38\") " pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.199301 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-khsmk" Oct 04 05:02:54 crc kubenswrapper[4574]: I1004 05:02:54.225044 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:02:54 crc kubenswrapper[4574]: W1004 05:02:54.697515 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16df8292_9780_4212_a920_bf0eed95da87.slice/crio-0535ae0034bb52957ed5134e371173486b6506562458def59ef1a9efe987e125 WatchSource:0}: Error finding container 0535ae0034bb52957ed5134e371173486b6506562458def59ef1a9efe987e125: Status 404 returned error can't find the container with id 0535ae0034bb52957ed5134e371173486b6506562458def59ef1a9efe987e125 Oct 04 05:02:55 crc kubenswrapper[4574]: I1004 05:02:55.011081 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"16df8292-9780-4212-a920-bf0eed95da87","Type":"ContainerStarted","Data":"0535ae0034bb52957ed5134e371173486b6506562458def59ef1a9efe987e125"} Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.154638 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.915426 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.919782 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.924914 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.925560 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.925964 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-hcj2k" Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.927892 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 04 05:02:56 crc kubenswrapper[4574]: I1004 05:02:56.930703 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.007701 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.007766 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.007925 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.007951 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.007977 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.007997 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknjr\" (UniqueName: \"kubernetes.io/projected/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-kube-api-access-mknjr\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.008017 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.008074 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-config\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc 
kubenswrapper[4574]: I1004 05:02:57.109673 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.109745 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.109836 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.109862 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.109890 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.109911 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mknjr\" (UniqueName: 
\"kubernetes.io/projected/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-kube-api-access-mknjr\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.109929 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.109986 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-config\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.110498 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.111811 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.113291 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-config\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc 
kubenswrapper[4574]: I1004 05:02:57.113907 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.117535 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.117581 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.120743 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.129113 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknjr\" (UniqueName: \"kubernetes.io/projected/1e5b7c0f-9b1c-411c-94b0-f57b8157c998-kube-api-access-mknjr\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.135298 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e5b7c0f-9b1c-411c-94b0-f57b8157c998\") " pod="openstack/ovsdbserver-sb-0" Oct 04 05:02:57 crc kubenswrapper[4574]: I1004 05:02:57.250557 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 04 05:03:00 crc kubenswrapper[4574]: I1004 05:03:00.064699 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c862e2a0-256a-470f-b35b-c244555f0c5f","Type":"ContainerStarted","Data":"76e57b619a6115b8ee9bbe0eaaa5ba26b281c99893c9e0c2383b6d3fe1a84dc8"} Oct 04 05:03:05 crc kubenswrapper[4574]: I1004 05:03:05.183458 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 04 05:03:05 crc kubenswrapper[4574]: I1004 05:03:05.314371 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:03:05 crc kubenswrapper[4574]: I1004 05:03:05.318270 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 04 05:03:05 crc kubenswrapper[4574]: I1004 05:03:05.907841 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 04 05:03:06 crc kubenswrapper[4574]: W1004 05:03:06.060652 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58b5f9d7_7329_4c3e_a7f6_fce81c9e7cb3.slice/crio-4f7be89d3b6ea21d78cf47c4ec227981acda51beb7732b1328963124f9684648 WatchSource:0}: Error finding container 4f7be89d3b6ea21d78cf47c4ec227981acda51beb7732b1328963124f9684648: Status 404 returned error can't find the container with id 4f7be89d3b6ea21d78cf47c4ec227981acda51beb7732b1328963124f9684648 Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.065953 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.066101 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cltq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,Sec
compProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-xbt7f_openstack(077e1f13-f5ea-4812-b14a-cf42ec68bb53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.068402 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" podUID="077e1f13-f5ea-4812-b14a-cf42ec68bb53" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.109949 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.110088 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99hj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-twlxf_openstack(2ab095f1-eeec-4911-bc7b-35acc57e729c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.111260 4574 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" podUID="2ab095f1-eeec-4911-bc7b-35acc57e729c" Oct 04 05:03:06 crc kubenswrapper[4574]: I1004 05:03:06.120307 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339","Type":"ContainerStarted","Data":"c35f8326c2d5c469e5ee052988be4c903d08b356313b8f8609f3d89d6fdc0088"} Oct 04 05:03:06 crc kubenswrapper[4574]: I1004 05:03:06.124321 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3","Type":"ContainerStarted","Data":"4f7be89d3b6ea21d78cf47c4ec227981acda51beb7732b1328963124f9684648"} Oct 04 05:03:06 crc kubenswrapper[4574]: I1004 05:03:06.127880 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f275e3ec-6c93-412b-875c-65b03a785dc0","Type":"ContainerStarted","Data":"c3a84142a084374b83915838ec96724909b0475d7f25526a17296eeaa3305199"} Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.136826 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.136960 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f4hx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6gb9q_openstack(8150a80a-d8b5-481a-a50f-5b04fc35dcc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:03:06 crc kubenswrapper[4574]: I1004 05:03:06.137560 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"cd1d5524-8818-4988-9969-45c2f2904fb4","Type":"ContainerStarted","Data":"5d2fa8f552634d36c1f7a63fea6fc9b12c2b10eab40d4aa466dffd2b1fb42771"} Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.138035 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" podUID="8150a80a-d8b5-481a-a50f-5b04fc35dcc3" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.139503 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" podUID="077e1f13-f5ea-4812-b14a-cf42ec68bb53" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.200546 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.200801 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mr6c7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-ppq64_openstack(724298ee-77dc-4d83-a3e5-24d40041670c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:03:06 crc kubenswrapper[4574]: E1004 05:03:06.206593 4574 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" podUID="724298ee-77dc-4d83-a3e5-24d40041670c" Oct 04 05:03:06 crc kubenswrapper[4574]: I1004 05:03:06.639039 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-khsmk"] Oct 04 05:03:06 crc kubenswrapper[4574]: W1004 05:03:06.670471 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3836030c_f0c4_4392_bc54_cc817fd89934.slice/crio-d98b7ddfcb19d5a0e8efbaa99a2ef72d21c73599868e87c0ec2f21e5db7fba54 WatchSource:0}: Error finding container d98b7ddfcb19d5a0e8efbaa99a2ef72d21c73599868e87c0ec2f21e5db7fba54: Status 404 returned error can't find the container with id d98b7ddfcb19d5a0e8efbaa99a2ef72d21c73599868e87c0ec2f21e5db7fba54 Oct 04 05:03:06 crc kubenswrapper[4574]: I1004 05:03:06.847400 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gl29s"] Oct 04 05:03:06 crc kubenswrapper[4574]: W1004 05:03:06.904899 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12012c68_85d3_4063_90d2_b80d4d169f38.slice/crio-263d43cc865828364d19b6960f613f7e95b0eaba5bdf512fed68fc9e0ec81b6f WatchSource:0}: Error finding container 263d43cc865828364d19b6960f613f7e95b0eaba5bdf512fed68fc9e0ec81b6f: Status 404 returned error can't find the container with id 263d43cc865828364d19b6960f613f7e95b0eaba5bdf512fed68fc9e0ec81b6f Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.145096 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk" event={"ID":"3836030c-f0c4-4392-bc54-cc817fd89934","Type":"ContainerStarted","Data":"d98b7ddfcb19d5a0e8efbaa99a2ef72d21c73599868e87c0ec2f21e5db7fba54"} Oct 04 05:03:07 crc kubenswrapper[4574]: 
I1004 05:03:07.146699 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gl29s" event={"ID":"12012c68-85d3-4063-90d2-b80d4d169f38","Type":"ContainerStarted","Data":"263d43cc865828364d19b6960f613f7e95b0eaba5bdf512fed68fc9e0ec81b6f"} Oct 04 05:03:07 crc kubenswrapper[4574]: E1004 05:03:07.148546 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" podUID="2ab095f1-eeec-4911-bc7b-35acc57e729c" Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.865151 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.877611 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.894547 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.911160 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-config\") pod \"724298ee-77dc-4d83-a3e5-24d40041670c\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.911254 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4hx5\" (UniqueName: \"kubernetes.io/projected/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-kube-api-access-f4hx5\") pod \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.911320 4574 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-dns-svc\") pod \"724298ee-77dc-4d83-a3e5-24d40041670c\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.911362 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-config\") pod \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\" (UID: \"8150a80a-d8b5-481a-a50f-5b04fc35dcc3\") " Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.911401 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr6c7\" (UniqueName: \"kubernetes.io/projected/724298ee-77dc-4d83-a3e5-24d40041670c-kube-api-access-mr6c7\") pod \"724298ee-77dc-4d83-a3e5-24d40041670c\" (UID: \"724298ee-77dc-4d83-a3e5-24d40041670c\") " Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.911832 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-config" (OuterVolumeSpecName: "config") pod "724298ee-77dc-4d83-a3e5-24d40041670c" (UID: "724298ee-77dc-4d83-a3e5-24d40041670c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.912096 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-config" (OuterVolumeSpecName: "config") pod "8150a80a-d8b5-481a-a50f-5b04fc35dcc3" (UID: "8150a80a-d8b5-481a-a50f-5b04fc35dcc3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.912320 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "724298ee-77dc-4d83-a3e5-24d40041670c" (UID: "724298ee-77dc-4d83-a3e5-24d40041670c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.918397 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724298ee-77dc-4d83-a3e5-24d40041670c-kube-api-access-mr6c7" (OuterVolumeSpecName: "kube-api-access-mr6c7") pod "724298ee-77dc-4d83-a3e5-24d40041670c" (UID: "724298ee-77dc-4d83-a3e5-24d40041670c"). InnerVolumeSpecName "kube-api-access-mr6c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:07 crc kubenswrapper[4574]: I1004 05:03:07.918497 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-kube-api-access-f4hx5" (OuterVolumeSpecName: "kube-api-access-f4hx5") pod "8150a80a-d8b5-481a-a50f-5b04fc35dcc3" (UID: "8150a80a-d8b5-481a-a50f-5b04fc35dcc3"). InnerVolumeSpecName "kube-api-access-f4hx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.013659 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr6c7\" (UniqueName: \"kubernetes.io/projected/724298ee-77dc-4d83-a3e5-24d40041670c-kube-api-access-mr6c7\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.013701 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.013711 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4hx5\" (UniqueName: \"kubernetes.io/projected/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-kube-api-access-f4hx5\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.013720 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/724298ee-77dc-4d83-a3e5-24d40041670c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.013729 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8150a80a-d8b5-481a-a50f-5b04fc35dcc3-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:08 crc kubenswrapper[4574]: W1004 05:03:08.017555 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e5b7c0f_9b1c_411c_94b0_f57b8157c998.slice/crio-ecdf97ae0c77ccabbc353e1718b52fbc3e8d5963b849327a9495193802842a85 WatchSource:0}: Error finding container ecdf97ae0c77ccabbc353e1718b52fbc3e8d5963b849327a9495193802842a85: Status 404 returned error can't find the container with id ecdf97ae0c77ccabbc353e1718b52fbc3e8d5963b849327a9495193802842a85 Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.157840 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1e5b7c0f-9b1c-411c-94b0-f57b8157c998","Type":"ContainerStarted","Data":"ecdf97ae0c77ccabbc353e1718b52fbc3e8d5963b849327a9495193802842a85"} Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.161193 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"16df8292-9780-4212-a920-bf0eed95da87","Type":"ContainerStarted","Data":"2428074d47972d1f6fdd6c280ab98af22b8aa63b1019d3e79680f303071f5225"} Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.166996 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" event={"ID":"8150a80a-d8b5-481a-a50f-5b04fc35dcc3","Type":"ContainerDied","Data":"b452c22e9e2bdee2e31126b6e0f2ac72cce39c22c0adf0d39cee82c916ca1a6e"} Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.167001 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6gb9q" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.174897 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e3699c-e19d-4c38-b763-32af874a1a90","Type":"ContainerStarted","Data":"19e755d98857189714271accecdc264d16ad48dbf72fe80113eb003f0a2478ba"} Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.195698 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" event={"ID":"724298ee-77dc-4d83-a3e5-24d40041670c","Type":"ContainerDied","Data":"3ae659106de4bfbd2e38a79a5ca6dfdc96c2ad2eb8f5ed276ce67c913a4ff115"} Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.195811 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppq64" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.297265 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppq64"] Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.303075 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppq64"] Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.353881 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6gb9q"] Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.355648 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6gb9q"] Oct 04 05:03:08 crc kubenswrapper[4574]: E1004 05:03:08.439903 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8150a80a_d8b5_481a_a50f_5b04fc35dcc3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724298ee_77dc_4d83_a3e5_24d40041670c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724298ee_77dc_4d83_a3e5_24d40041670c.slice/crio-3ae659106de4bfbd2e38a79a5ca6dfdc96c2ad2eb8f5ed276ce67c913a4ff115\": RecentStats: unable to find data in memory cache]" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.743847 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724298ee-77dc-4d83-a3e5-24d40041670c" path="/var/lib/kubelet/pods/724298ee-77dc-4d83-a3e5-24d40041670c/volumes" Oct 04 05:03:08 crc kubenswrapper[4574]: I1004 05:03:08.745044 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8150a80a-d8b5-481a-a50f-5b04fc35dcc3" path="/var/lib/kubelet/pods/8150a80a-d8b5-481a-a50f-5b04fc35dcc3/volumes" Oct 04 05:03:14 crc 
kubenswrapper[4574]: I1004 05:03:14.256924 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339","Type":"ContainerStarted","Data":"9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4"} Oct 04 05:03:14 crc kubenswrapper[4574]: I1004 05:03:14.257777 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 04 05:03:14 crc kubenswrapper[4574]: I1004 05:03:14.264529 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f275e3ec-6c93-412b-875c-65b03a785dc0","Type":"ContainerStarted","Data":"4da1edb814f071fc89506afb0cc6e2b1fe6fa77ac0cb7b235473779a22d4c8c1"} Oct 04 05:03:14 crc kubenswrapper[4574]: I1004 05:03:14.285324 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.553535781 podStartE2EDuration="25.28530445s" podCreationTimestamp="2025-10-04 05:02:49 +0000 UTC" firstStartedPulling="2025-10-04 05:03:06.08060989 +0000 UTC m=+1011.934752932" lastFinishedPulling="2025-10-04 05:03:13.812378559 +0000 UTC m=+1019.666521601" observedRunningTime="2025-10-04 05:03:14.274792089 +0000 UTC m=+1020.128935131" watchObservedRunningTime="2025-10-04 05:03:14.28530445 +0000 UTC m=+1020.139447482" Oct 04 05:03:14 crc kubenswrapper[4574]: I1004 05:03:14.290584 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3","Type":"ContainerStarted","Data":"67dc261a4c9bbd2d44d5ad27626ab097e6a0fb2d7892b81cf7f50edd8f6a90f8"} Oct 04 05:03:14 crc kubenswrapper[4574]: I1004 05:03:14.291413 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 04 05:03:14 crc kubenswrapper[4574]: I1004 05:03:14.308393 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"c862e2a0-256a-470f-b35b-c244555f0c5f","Type":"ContainerStarted","Data":"6323aab81d22f6d9ca61a64da04f11dbd1ff75a1ee1e590a149d53b463840961"} Oct 04 05:03:14 crc kubenswrapper[4574]: I1004 05:03:14.356732 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.037094628 podStartE2EDuration="27.356711478s" podCreationTimestamp="2025-10-04 05:02:47 +0000 UTC" firstStartedPulling="2025-10-04 05:03:06.080795476 +0000 UTC m=+1011.934938518" lastFinishedPulling="2025-10-04 05:03:13.400412326 +0000 UTC m=+1019.254555368" observedRunningTime="2025-10-04 05:03:14.326618735 +0000 UTC m=+1020.180761777" watchObservedRunningTime="2025-10-04 05:03:14.356711478 +0000 UTC m=+1020.210854520" Oct 04 05:03:15 crc kubenswrapper[4574]: I1004 05:03:15.338532 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk" event={"ID":"3836030c-f0c4-4392-bc54-cc817fd89934","Type":"ContainerStarted","Data":"872b7d102952cf53dfad120fb77a0a8decbd7f0d1bcdebeae65fff7275c55c1f"} Oct 04 05:03:15 crc kubenswrapper[4574]: I1004 05:03:15.338911 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-khsmk" Oct 04 05:03:15 crc kubenswrapper[4574]: I1004 05:03:15.341945 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1e5b7c0f-9b1c-411c-94b0-f57b8157c998","Type":"ContainerStarted","Data":"b7ca623b984901c308363c0d9bb7dd02e6d7bb7dce1745846a0744222e1f92ee"} Oct 04 05:03:15 crc kubenswrapper[4574]: I1004 05:03:15.347228 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cd1d5524-8818-4988-9969-45c2f2904fb4","Type":"ContainerStarted","Data":"ebc06342da4e1ebf4c7eb1ded89256a5cbec42e53a6ce3a8cd24ca36b99de079"} Oct 04 05:03:15 crc kubenswrapper[4574]: I1004 05:03:15.350423 4574 generic.go:334] "Generic (PLEG): container finished" 
podID="12012c68-85d3-4063-90d2-b80d4d169f38" containerID="e28484550959e521b98b54d302cda74d89c3fad28110a5a6541b9eedf96fb091" exitCode=0 Oct 04 05:03:15 crc kubenswrapper[4574]: I1004 05:03:15.350549 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gl29s" event={"ID":"12012c68-85d3-4063-90d2-b80d4d169f38","Type":"ContainerDied","Data":"e28484550959e521b98b54d302cda74d89c3fad28110a5a6541b9eedf96fb091"} Oct 04 05:03:15 crc kubenswrapper[4574]: I1004 05:03:15.360295 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-khsmk" podStartSLOduration=15.20857462 podStartE2EDuration="22.360278925s" podCreationTimestamp="2025-10-04 05:02:53 +0000 UTC" firstStartedPulling="2025-10-04 05:03:06.674697596 +0000 UTC m=+1012.528840638" lastFinishedPulling="2025-10-04 05:03:13.826401901 +0000 UTC m=+1019.680544943" observedRunningTime="2025-10-04 05:03:15.357381202 +0000 UTC m=+1021.211524254" watchObservedRunningTime="2025-10-04 05:03:15.360278925 +0000 UTC m=+1021.214421967" Oct 04 05:03:16 crc kubenswrapper[4574]: I1004 05:03:16.358918 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gl29s" event={"ID":"12012c68-85d3-4063-90d2-b80d4d169f38","Type":"ContainerStarted","Data":"4194538e56939c74ecdb457605fcad96cc15c8d7956ef96b5156b9f934bb58bf"} Oct 04 05:03:16 crc kubenswrapper[4574]: I1004 05:03:16.359264 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gl29s" event={"ID":"12012c68-85d3-4063-90d2-b80d4d169f38","Type":"ContainerStarted","Data":"b8e5b935a577048513961ceca1a6d39b0f81a04b6e7bcd61f78da7fe9e1f28c2"} Oct 04 05:03:16 crc kubenswrapper[4574]: I1004 05:03:16.359284 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:03:16 crc kubenswrapper[4574]: I1004 05:03:16.359299 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:03:16 crc kubenswrapper[4574]: I1004 05:03:16.381874 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gl29s" podStartSLOduration=16.562537855 podStartE2EDuration="23.381858029s" podCreationTimestamp="2025-10-04 05:02:53 +0000 UTC" firstStartedPulling="2025-10-04 05:03:06.909705525 +0000 UTC m=+1012.763848567" lastFinishedPulling="2025-10-04 05:03:13.729025689 +0000 UTC m=+1019.583168741" observedRunningTime="2025-10-04 05:03:16.375243029 +0000 UTC m=+1022.229386071" watchObservedRunningTime="2025-10-04 05:03:16.381858029 +0000 UTC m=+1022.236001071" Oct 04 05:03:18 crc kubenswrapper[4574]: I1004 05:03:18.197397 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.377887 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.378437 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xbt7f"] Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.445217 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1e5b7c0f-9b1c-411c-94b0-f57b8157c998","Type":"ContainerStarted","Data":"d83463499f8d30dbf236ccb5defbdb10483d6f037994576ed9ad1461b03ca7dd"} Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.451990 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"cd1d5524-8818-4988-9969-45c2f2904fb4","Type":"ContainerStarted","Data":"7f9bc7d2d6c8e7498e3d2be1ccb95383404766819be8b0209c3d6f9d9c00eb52"} Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.465577 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-j9gf7"] Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.473258 
4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.506316 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-j9gf7"] Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.512014 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.007649384 podStartE2EDuration="29.511996661s" podCreationTimestamp="2025-10-04 05:02:51 +0000 UTC" firstStartedPulling="2025-10-04 05:03:06.080570499 +0000 UTC m=+1011.934713541" lastFinishedPulling="2025-10-04 05:03:19.584917776 +0000 UTC m=+1025.439060818" observedRunningTime="2025-10-04 05:03:20.511216558 +0000 UTC m=+1026.365359600" watchObservedRunningTime="2025-10-04 05:03:20.511996661 +0000 UTC m=+1026.366139693" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.567831 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.568060 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84mgw\" (UniqueName: \"kubernetes.io/projected/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-kube-api-access-84mgw\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.568090 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-config\") pod 
\"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.596892 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.025817725 podStartE2EDuration="25.596872135s" podCreationTimestamp="2025-10-04 05:02:55 +0000 UTC" firstStartedPulling="2025-10-04 05:03:08.022577706 +0000 UTC m=+1013.876720748" lastFinishedPulling="2025-10-04 05:03:19.593632116 +0000 UTC m=+1025.447775158" observedRunningTime="2025-10-04 05:03:20.560730568 +0000 UTC m=+1026.414873610" watchObservedRunningTime="2025-10-04 05:03:20.596872135 +0000 UTC m=+1026.451015177" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.670172 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.670362 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84mgw\" (UniqueName: \"kubernetes.io/projected/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-kube-api-access-84mgw\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.670395 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-config\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.671639 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.672044 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-config\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.692777 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84mgw\" (UniqueName: \"kubernetes.io/projected/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-kube-api-access-84mgw\") pod \"dnsmasq-dns-7cb5889db5-j9gf7\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.819669 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:20 crc kubenswrapper[4574]: I1004 05:03:20.970049 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.076668 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-dns-svc\") pod \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.076776 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cltq\" (UniqueName: \"kubernetes.io/projected/077e1f13-f5ea-4812-b14a-cf42ec68bb53-kube-api-access-8cltq\") pod \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.076833 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-config\") pod \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\" (UID: \"077e1f13-f5ea-4812-b14a-cf42ec68bb53\") " Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.077677 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "077e1f13-f5ea-4812-b14a-cf42ec68bb53" (UID: "077e1f13-f5ea-4812-b14a-cf42ec68bb53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.077752 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-config" (OuterVolumeSpecName: "config") pod "077e1f13-f5ea-4812-b14a-cf42ec68bb53" (UID: "077e1f13-f5ea-4812-b14a-cf42ec68bb53"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.091182 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077e1f13-f5ea-4812-b14a-cf42ec68bb53-kube-api-access-8cltq" (OuterVolumeSpecName: "kube-api-access-8cltq") pod "077e1f13-f5ea-4812-b14a-cf42ec68bb53" (UID: "077e1f13-f5ea-4812-b14a-cf42ec68bb53"). InnerVolumeSpecName "kube-api-access-8cltq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.179394 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.179431 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cltq\" (UniqueName: \"kubernetes.io/projected/077e1f13-f5ea-4812-b14a-cf42ec68bb53-kube-api-access-8cltq\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.179444 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/077e1f13-f5ea-4812-b14a-cf42ec68bb53-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.251707 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.294906 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.385223 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-j9gf7"] Oct 04 05:03:21 crc kubenswrapper[4574]: W1004 05:03:21.387295 4574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c70fc86_1d9c_43d4_aa53_1881f72d56b6.slice/crio-424a747444e9b67e7c51e36f1523ce42f6055ab67997819c06b8739a37e9131f WatchSource:0}: Error finding container 424a747444e9b67e7c51e36f1523ce42f6055ab67997819c06b8739a37e9131f: Status 404 returned error can't find the container with id 424a747444e9b67e7c51e36f1523ce42f6055ab67997819c06b8739a37e9131f Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.460668 4574 generic.go:334] "Generic (PLEG): container finished" podID="f275e3ec-6c93-412b-875c-65b03a785dc0" containerID="4da1edb814f071fc89506afb0cc6e2b1fe6fa77ac0cb7b235473779a22d4c8c1" exitCode=0 Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.460740 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f275e3ec-6c93-412b-875c-65b03a785dc0","Type":"ContainerDied","Data":"4da1edb814f071fc89506afb0cc6e2b1fe6fa77ac0cb7b235473779a22d4c8c1"} Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.462891 4574 generic.go:334] "Generic (PLEG): container finished" podID="c862e2a0-256a-470f-b35b-c244555f0c5f" containerID="6323aab81d22f6d9ca61a64da04f11dbd1ff75a1ee1e590a149d53b463840961" exitCode=0 Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.463073 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c862e2a0-256a-470f-b35b-c244555f0c5f","Type":"ContainerDied","Data":"6323aab81d22f6d9ca61a64da04f11dbd1ff75a1ee1e590a149d53b463840961"} Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.464714 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" event={"ID":"077e1f13-f5ea-4812-b14a-cf42ec68bb53","Type":"ContainerDied","Data":"deecbe12a2aca0bec3f2dea8786b8c0fcd04dd6c27442113354f56dd829df4ab"} Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.464851 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-xbt7f" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.468284 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" event={"ID":"2c70fc86-1d9c-43d4-aa53-1881f72d56b6","Type":"ContainerStarted","Data":"424a747444e9b67e7c51e36f1523ce42f6055ab67997819c06b8739a37e9131f"} Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.468333 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.525011 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.586304 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xbt7f"] Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.607496 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-xbt7f"] Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.652630 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.760499 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.763554 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.770346 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.770692 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.770361 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dqt7r" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.771029 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.836643 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529th\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-kube-api-access-529th\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.836862 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.837006 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.837154 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/74b762df-991e-4e0c-9be6-c3e468408254-lock\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.838161 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/74b762df-991e-4e0c-9be6-c3e468408254-cache\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.841755 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-twlxf"] Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.889651 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vdgfn"] Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.891623 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.900560 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.910954 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vdgfn"] Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.940226 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.940492 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-529th\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-kube-api-access-529th\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.940606 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.940687 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.940797 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.940900 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/74b762df-991e-4e0c-9be6-c3e468408254-lock\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.940990 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-config\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.941067 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmddc\" (UniqueName: \"kubernetes.io/projected/15175a86-4332-4519-a593-3914d0686e66-kube-api-access-nmddc\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.941181 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/74b762df-991e-4e0c-9be6-c3e468408254-cache\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.941476 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/74b762df-991e-4e0c-9be6-c3e468408254-lock\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: E1004 05:03:21.941575 4574 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 04 05:03:21 crc kubenswrapper[4574]: E1004 05:03:21.941640 4574 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 04 05:03:21 crc kubenswrapper[4574]: E1004 05:03:21.941681 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift podName:74b762df-991e-4e0c-9be6-c3e468408254 nodeName:}" failed. No retries permitted until 2025-10-04 05:03:22.441666728 +0000 UTC m=+1028.295809770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift") pod "swift-storage-0" (UID: "74b762df-991e-4e0c-9be6-c3e468408254") : configmap "swift-ring-files" not found Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.941101 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.942027 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/74b762df-991e-4e0c-9be6-c3e468408254-cache\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.966430 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-529th\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-kube-api-access-529th\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:21 crc kubenswrapper[4574]: I1004 05:03:21.971775 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.004172 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4576m"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.005321 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.012833 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.027263 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4576m"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043702 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5xq\" (UniqueName: \"kubernetes.io/projected/fb659229-980c-4368-a799-f0db3f3330da-kube-api-access-rr5xq\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043787 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-config\") pod 
\"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043812 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmddc\" (UniqueName: \"kubernetes.io/projected/15175a86-4332-4519-a593-3914d0686e66-kube-api-access-nmddc\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043839 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb659229-980c-4368-a799-f0db3f3330da-combined-ca-bundle\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043870 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb659229-980c-4368-a799-f0db3f3330da-config\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043893 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb659229-980c-4368-a799-f0db3f3330da-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043917 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/fb659229-980c-4368-a799-f0db3f3330da-ovn-rundir\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043936 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb659229-980c-4368-a799-f0db3f3330da-ovs-rundir\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043955 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.043981 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.044891 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.044913 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.045206 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-config\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.069984 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmddc\" (UniqueName: \"kubernetes.io/projected/15175a86-4332-4519-a593-3914d0686e66-kube-api-access-nmddc\") pod \"dnsmasq-dns-6c89d5d749-vdgfn\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") " pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.145974 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr5xq\" (UniqueName: \"kubernetes.io/projected/fb659229-980c-4368-a799-f0db3f3330da-kube-api-access-rr5xq\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.146040 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb659229-980c-4368-a799-f0db3f3330da-combined-ca-bundle\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.146081 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb659229-980c-4368-a799-f0db3f3330da-config\") pod \"ovn-controller-metrics-4576m\" (UID: 
\"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.146101 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb659229-980c-4368-a799-f0db3f3330da-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.146142 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb659229-980c-4368-a799-f0db3f3330da-ovn-rundir\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.146162 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb659229-980c-4368-a799-f0db3f3330da-ovs-rundir\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.146506 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb659229-980c-4368-a799-f0db3f3330da-ovs-rundir\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.147055 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb659229-980c-4368-a799-f0db3f3330da-ovn-rundir\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " 
pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.148021 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb659229-980c-4368-a799-f0db3f3330da-config\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.150317 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb659229-980c-4368-a799-f0db3f3330da-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.172815 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb659229-980c-4368-a799-f0db3f3330da-combined-ca-bundle\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.172862 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr5xq\" (UniqueName: \"kubernetes.io/projected/fb659229-980c-4368-a799-f0db3f3330da-kube-api-access-rr5xq\") pod \"ovn-controller-metrics-4576m\" (UID: \"fb659229-980c-4368-a799-f0db3f3330da\") " pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.208748 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.215216 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gbs8h"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.216557 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.221726 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.221874 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.222094 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.236768 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gbs8h"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.247583 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65ae5a48-3442-4149-9dbd-ac23191fa438-etc-swift\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.247623 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-swiftconf\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.247662 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skm7w\" (UniqueName: \"kubernetes.io/projected/65ae5a48-3442-4149-9dbd-ac23191fa438-kube-api-access-skm7w\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.247702 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-dispersionconf\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.247723 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-scripts\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.247751 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-ring-data-devices\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.247775 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-combined-ca-bundle\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.298401 
4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-j9gf7"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.322826 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-b7wnz"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.329957 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.335847 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.340227 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b7wnz"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.348946 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skm7w\" (UniqueName: \"kubernetes.io/projected/65ae5a48-3442-4149-9dbd-ac23191fa438-kube-api-access-skm7w\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.348995 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349038 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-dispersionconf\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc 
kubenswrapper[4574]: I1004 05:03:22.349060 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-dns-svc\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349081 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-scripts\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349107 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-ring-data-devices\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349122 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-combined-ca-bundle\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349158 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-config\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349187 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb7kd\" (UniqueName: \"kubernetes.io/projected/618c6c39-89f8-45ee-b9df-753294b5cfeb-kube-api-access-kb7kd\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349210 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349267 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65ae5a48-3442-4149-9dbd-ac23191fa438-etc-swift\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.349288 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-swiftconf\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.351686 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-ring-data-devices\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.352521 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65ae5a48-3442-4149-9dbd-ac23191fa438-etc-swift\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.352953 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-scripts\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.363581 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4576m" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.367038 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-swiftconf\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.371848 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-combined-ca-bundle\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.379175 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-dispersionconf\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 
05:03:22.421983 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skm7w\" (UniqueName: \"kubernetes.io/projected/65ae5a48-3442-4149-9dbd-ac23191fa438-kube-api-access-skm7w\") pod \"swift-ring-rebalance-gbs8h\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.451940 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-config\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.452008 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb7kd\" (UniqueName: \"kubernetes.io/projected/618c6c39-89f8-45ee-b9df-753294b5cfeb-kube-api-access-kb7kd\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.452051 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.452142 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.452272 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.452368 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-dns-svc\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.453437 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: E1004 05:03:22.455295 4574 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 04 05:03:22 crc kubenswrapper[4574]: E1004 05:03:22.455317 4574 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 04 05:03:22 crc kubenswrapper[4574]: E1004 05:03:22.455356 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift podName:74b762df-991e-4e0c-9be6-c3e468408254 nodeName:}" failed. No retries permitted until 2025-10-04 05:03:23.455342816 +0000 UTC m=+1029.309485858 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift") pod "swift-storage-0" (UID: "74b762df-991e-4e0c-9be6-c3e468408254") : configmap "swift-ring-files" not found Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.457047 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-config\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.458024 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-dns-svc\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.458574 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.480298 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb7kd\" (UniqueName: \"kubernetes.io/projected/618c6c39-89f8-45ee-b9df-753294b5cfeb-kube-api-access-kb7kd\") pod \"dnsmasq-dns-698758b865-b7wnz\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.488041 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f275e3ec-6c93-412b-875c-65b03a785dc0","Type":"ContainerStarted","Data":"b5d2027cc29b503ca128bb58127e27059660148975f501c411bd8192f045cb19"} Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.507952 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c862e2a0-256a-470f-b35b-c244555f0c5f","Type":"ContainerStarted","Data":"7f902061025bfbcf1ab7527508c499cbe43e3c0a55495c0b3a98fc7559acf9b5"} Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.512860 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.748007408 podStartE2EDuration="36.512840885s" podCreationTimestamp="2025-10-04 05:02:46 +0000 UTC" firstStartedPulling="2025-10-04 05:03:06.062672876 +0000 UTC m=+1011.916815918" lastFinishedPulling="2025-10-04 05:03:13.827506353 +0000 UTC m=+1019.681649395" observedRunningTime="2025-10-04 05:03:22.509507269 +0000 UTC m=+1028.363650311" watchObservedRunningTime="2025-10-04 05:03:22.512840885 +0000 UTC m=+1028.366983927" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.539997 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.119149626 podStartE2EDuration="36.539976933s" podCreationTimestamp="2025-10-04 05:02:46 +0000 UTC" firstStartedPulling="2025-10-04 05:02:59.307163462 +0000 UTC m=+1005.161306514" lastFinishedPulling="2025-10-04 05:03:13.727990779 +0000 UTC m=+1019.582133821" observedRunningTime="2025-10-04 05:03:22.537515983 +0000 UTC m=+1028.391659025" watchObservedRunningTime="2025-10-04 05:03:22.539976933 +0000 UTC m=+1028.394119975" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.550072 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.669132 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.752478 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077e1f13-f5ea-4812-b14a-cf42ec68bb53" path="/var/lib/kubelet/pods/077e1f13-f5ea-4812-b14a-cf42ec68bb53/volumes" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.909770 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vdgfn"] Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.918523 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 04 05:03:22 crc kubenswrapper[4574]: I1004 05:03:22.924231 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.006064 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.019116 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4576m"] Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.279603 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gbs8h"] Oct 04 05:03:23 crc kubenswrapper[4574]: W1004 05:03:23.287299 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ae5a48_3442_4149_9dbd_ac23191fa438.slice/crio-b3320748261642326f52265025407b474b05fd28d719e1798845ba9b8df5a7a3 WatchSource:0}: Error finding container b3320748261642326f52265025407b474b05fd28d719e1798845ba9b8df5a7a3: Status 404 returned error can't find the container with id b3320748261642326f52265025407b474b05fd28d719e1798845ba9b8df5a7a3 Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.371044 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-b7wnz"] Oct 04 05:03:23 crc kubenswrapper[4574]: W1004 05:03:23.386454 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod618c6c39_89f8_45ee_b9df_753294b5cfeb.slice/crio-cdb2ac831153cbcaf370d6bee5406a4efbfdb6d5657a7b4f067a3204ad2cc76f WatchSource:0}: Error finding container cdb2ac831153cbcaf370d6bee5406a4efbfdb6d5657a7b4f067a3204ad2cc76f: Status 404 returned error can't find the container with id cdb2ac831153cbcaf370d6bee5406a4efbfdb6d5657a7b4f067a3204ad2cc76f Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.484049 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:23 crc kubenswrapper[4574]: E1004 05:03:23.484284 4574 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 04 05:03:23 crc kubenswrapper[4574]: E1004 05:03:23.484302 4574 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 04 05:03:23 crc kubenswrapper[4574]: E1004 05:03:23.484358 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift podName:74b762df-991e-4e0c-9be6-c3e468408254 nodeName:}" failed. No retries permitted until 2025-10-04 05:03:25.484340063 +0000 UTC m=+1031.338483105 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift") pod "swift-storage-0" (UID: "74b762df-991e-4e0c-9be6-c3e468408254") : configmap "swift-ring-files" not found Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.517359 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b7wnz" event={"ID":"618c6c39-89f8-45ee-b9df-753294b5cfeb","Type":"ContainerStarted","Data":"cdb2ac831153cbcaf370d6bee5406a4efbfdb6d5657a7b4f067a3204ad2cc76f"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.518129 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gbs8h" event={"ID":"65ae5a48-3442-4149-9dbd-ac23191fa438","Type":"ContainerStarted","Data":"b3320748261642326f52265025407b474b05fd28d719e1798845ba9b8df5a7a3"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.519342 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4576m" event={"ID":"fb659229-980c-4368-a799-f0db3f3330da","Type":"ContainerStarted","Data":"2f2ea35e9b4a8745114b11fc27850862f7b601fdf1b5cf07430aabafd3b45395"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.519382 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4576m" event={"ID":"fb659229-980c-4368-a799-f0db3f3330da","Type":"ContainerStarted","Data":"f23f0f9c6116773f66e9cbb66288ebacc105fd0e51dfe06764a8d76b0e9ab146"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.520706 4574 generic.go:334] "Generic (PLEG): container finished" podID="15175a86-4332-4519-a593-3914d0686e66" containerID="b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3" exitCode=0 Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.520767 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" 
event={"ID":"15175a86-4332-4519-a593-3914d0686e66","Type":"ContainerDied","Data":"b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.520817 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" event={"ID":"15175a86-4332-4519-a593-3914d0686e66","Type":"ContainerStarted","Data":"35ce0da653b5a3df1e2d185791c22313507638adf2dfe4c7d4cdc9ce4ac4fd23"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.524562 4574 generic.go:334] "Generic (PLEG): container finished" podID="2ab095f1-eeec-4911-bc7b-35acc57e729c" containerID="dff2f915e5fc5d345ec1aa11b928264161068e7f734fea5f9fb2d996d328a672" exitCode=0 Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.524662 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" event={"ID":"2ab095f1-eeec-4911-bc7b-35acc57e729c","Type":"ContainerDied","Data":"dff2f915e5fc5d345ec1aa11b928264161068e7f734fea5f9fb2d996d328a672"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.533554 4574 generic.go:334] "Generic (PLEG): container finished" podID="2c70fc86-1d9c-43d4-aa53-1881f72d56b6" containerID="75085d17484289f9102f7ee3fe6e88b19252aea2d99d657f8d84c162557553c4" exitCode=0 Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.534943 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" event={"ID":"2c70fc86-1d9c-43d4-aa53-1881f72d56b6","Type":"ContainerDied","Data":"75085d17484289f9102f7ee3fe6e88b19252aea2d99d657f8d84c162557553c4"} Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.561561 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4576m" podStartSLOduration=2.561543207 podStartE2EDuration="2.561543207s" podCreationTimestamp="2025-10-04 05:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-04 05:03:23.558888971 +0000 UTC m=+1029.413032023" watchObservedRunningTime="2025-10-04 05:03:23.561543207 +0000 UTC m=+1029.415686259" Oct 04 05:03:23 crc kubenswrapper[4574]: I1004 05:03:23.702459 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.092218 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.114506 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.122225 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.135406 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.135577 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.135615 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mv7vq" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.135789 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.164835 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.170737 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.210734 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84mgw\" (UniqueName: \"kubernetes.io/projected/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-kube-api-access-84mgw\") pod \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.210985 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99hj4\" (UniqueName: \"kubernetes.io/projected/2ab095f1-eeec-4911-bc7b-35acc57e729c-kube-api-access-99hj4\") pod \"2ab095f1-eeec-4911-bc7b-35acc57e729c\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.211128 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-config\") pod \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.211363 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-config\") pod \"2ab095f1-eeec-4911-bc7b-35acc57e729c\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.211567 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-dns-svc\") pod \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\" (UID: \"2c70fc86-1d9c-43d4-aa53-1881f72d56b6\") " Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.211708 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-dns-svc\") pod \"2ab095f1-eeec-4911-bc7b-35acc57e729c\" (UID: \"2ab095f1-eeec-4911-bc7b-35acc57e729c\") " Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.211960 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nxht\" (UniqueName: \"kubernetes.io/projected/3e306e34-ea03-4a60-9adc-99f30618be02-kube-api-access-2nxht\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.212058 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.212155 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.212298 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e306e34-ea03-4a60-9adc-99f30618be02-scripts\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.212377 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e306e34-ea03-4a60-9adc-99f30618be02-config\") pod \"ovn-northd-0\" (UID: 
\"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.212879 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e306e34-ea03-4a60-9adc-99f30618be02-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.213177 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.221579 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab095f1-eeec-4911-bc7b-35acc57e729c-kube-api-access-99hj4" (OuterVolumeSpecName: "kube-api-access-99hj4") pod "2ab095f1-eeec-4911-bc7b-35acc57e729c" (UID: "2ab095f1-eeec-4911-bc7b-35acc57e729c"). InnerVolumeSpecName "kube-api-access-99hj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.236384 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c70fc86-1d9c-43d4-aa53-1881f72d56b6" (UID: "2c70fc86-1d9c-43d4-aa53-1881f72d56b6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.243939 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-config" (OuterVolumeSpecName: "config") pod "2ab095f1-eeec-4911-bc7b-35acc57e729c" (UID: "2ab095f1-eeec-4911-bc7b-35acc57e729c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.249347 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-config" (OuterVolumeSpecName: "config") pod "2c70fc86-1d9c-43d4-aa53-1881f72d56b6" (UID: "2c70fc86-1d9c-43d4-aa53-1881f72d56b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.260884 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-kube-api-access-84mgw" (OuterVolumeSpecName: "kube-api-access-84mgw") pod "2c70fc86-1d9c-43d4-aa53-1881f72d56b6" (UID: "2c70fc86-1d9c-43d4-aa53-1881f72d56b6"). InnerVolumeSpecName "kube-api-access-84mgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.263815 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ab095f1-eeec-4911-bc7b-35acc57e729c" (UID: "2ab095f1-eeec-4911-bc7b-35acc57e729c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.315025 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e306e34-ea03-4a60-9adc-99f30618be02-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.315763 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.315988 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nxht\" (UniqueName: \"kubernetes.io/projected/3e306e34-ea03-4a60-9adc-99f30618be02-kube-api-access-2nxht\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.315712 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e306e34-ea03-4a60-9adc-99f30618be02-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.316166 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.316299 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.316437 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e306e34-ea03-4a60-9adc-99f30618be02-scripts\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.316530 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e306e34-ea03-4a60-9adc-99f30618be02-config\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.316942 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.317138 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.317635 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.318176 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ab095f1-eeec-4911-bc7b-35acc57e729c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.318303 4574 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84mgw\" (UniqueName: \"kubernetes.io/projected/2c70fc86-1d9c-43d4-aa53-1881f72d56b6-kube-api-access-84mgw\") on node \"crc\" DevicePath \"\""
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.318404 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99hj4\" (UniqueName: \"kubernetes.io/projected/2ab095f1-eeec-4911-bc7b-35acc57e729c-kube-api-access-99hj4\") on node \"crc\" DevicePath \"\""
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.318440 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e306e34-ea03-4a60-9adc-99f30618be02-config\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.317834 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e306e34-ea03-4a60-9adc-99f30618be02-scripts\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.327996 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.329010 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.332696 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e306e34-ea03-4a60-9adc-99f30618be02-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.340045 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nxht\" (UniqueName: \"kubernetes.io/projected/3e306e34-ea03-4a60-9adc-99f30618be02-kube-api-access-2nxht\") pod \"ovn-northd-0\" (UID: \"3e306e34-ea03-4a60-9adc-99f30618be02\") " pod="openstack/ovn-northd-0"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.494665 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.548645 4574 generic.go:334] "Generic (PLEG): container finished" podID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerID="57ce78c2011e0774f4d945679ad0d85dfac8585f3033f9d235c9ac5112a3c7d8" exitCode=0
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.548709 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b7wnz" event={"ID":"618c6c39-89f8-45ee-b9df-753294b5cfeb","Type":"ContainerDied","Data":"57ce78c2011e0774f4d945679ad0d85dfac8585f3033f9d235c9ac5112a3c7d8"}
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.552900 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" event={"ID":"15175a86-4332-4519-a593-3914d0686e66","Type":"ContainerStarted","Data":"1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856"}
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.553973 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.557057 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf" event={"ID":"2ab095f1-eeec-4911-bc7b-35acc57e729c","Type":"ContainerDied","Data":"a52e844bc757afb26546c7f992831b673a698cfd43a0a7ff273a2d7068c585d5"}
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.557101 4574 scope.go:117] "RemoveContainer" containerID="dff2f915e5fc5d345ec1aa11b928264161068e7f734fea5f9fb2d996d328a672"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.557218 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-twlxf"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.599813 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7" event={"ID":"2c70fc86-1d9c-43d4-aa53-1881f72d56b6","Type":"ContainerDied","Data":"424a747444e9b67e7c51e36f1523ce42f6055ab67997819c06b8739a37e9131f"}
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.599924 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-j9gf7"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.610566 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" podStartSLOduration=3.610543787 podStartE2EDuration="3.610543787s" podCreationTimestamp="2025-10-04 05:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:03:24.601016794 +0000 UTC m=+1030.455159836" watchObservedRunningTime="2025-10-04 05:03:24.610543787 +0000 UTC m=+1030.464686829"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.656035 4574 scope.go:117] "RemoveContainer" containerID="75085d17484289f9102f7ee3fe6e88b19252aea2d99d657f8d84c162557553c4"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.684032 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-twlxf"]
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.716500 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-twlxf"]
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.764339 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab095f1-eeec-4911-bc7b-35acc57e729c" path="/var/lib/kubelet/pods/2ab095f1-eeec-4911-bc7b-35acc57e729c/volumes"
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.764910 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-j9gf7"]
Oct 04 05:03:24 crc kubenswrapper[4574]: I1004 05:03:24.773315 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-j9gf7"]
Oct 04 05:03:25 crc kubenswrapper[4574]: I1004 05:03:25.126466 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 04 05:03:25 crc kubenswrapper[4574]: I1004 05:03:25.551612 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0"
Oct 04 05:03:25 crc kubenswrapper[4574]: E1004 05:03:25.551803 4574 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 04 05:03:25 crc kubenswrapper[4574]: E1004 05:03:25.552002 4574 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 04 05:03:25 crc kubenswrapper[4574]: E1004 05:03:25.552064 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift podName:74b762df-991e-4e0c-9be6-c3e468408254 nodeName:}" failed. No retries permitted until 2025-10-04 05:03:29.552045425 +0000 UTC m=+1035.406188467 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift") pod "swift-storage-0" (UID: "74b762df-991e-4e0c-9be6-c3e468408254") : configmap "swift-ring-files" not found
Oct 04 05:03:25 crc kubenswrapper[4574]: I1004 05:03:25.610719 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b7wnz" event={"ID":"618c6c39-89f8-45ee-b9df-753294b5cfeb","Type":"ContainerStarted","Data":"4253713f3e4e44ab75aa65bad2554e74dab2beeae4379a99ce9fd4bb2c0e374c"}
Oct 04 05:03:25 crc kubenswrapper[4574]: I1004 05:03:25.610781 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-b7wnz"
Oct 04 05:03:25 crc kubenswrapper[4574]: I1004 05:03:25.612123 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e306e34-ea03-4a60-9adc-99f30618be02","Type":"ContainerStarted","Data":"771dc4dd46e1b59984df2e2e5d0f6a99c8a7e0a101259504179fdfd24c9c12d8"}
Oct 04 05:03:25 crc kubenswrapper[4574]: I1004 05:03:25.636224 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-b7wnz" podStartSLOduration=3.6362064480000003 podStartE2EDuration="3.636206448s" podCreationTimestamp="2025-10-04 05:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:03:25.633744718 +0000 UTC m=+1031.487887760" watchObservedRunningTime="2025-10-04 05:03:25.636206448 +0000 UTC m=+1031.490349490"
Oct 04 05:03:26 crc kubenswrapper[4574]: I1004 05:03:26.744549 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c70fc86-1d9c-43d4-aa53-1881f72d56b6" path="/var/lib/kubelet/pods/2c70fc86-1d9c-43d4-aa53-1881f72d56b6/volumes"
Oct 04 05:03:27 crc kubenswrapper[4574]: I1004 05:03:27.997222 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 04 05:03:27 crc kubenswrapper[4574]: I1004 05:03:27.998437 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 04 05:03:28 crc kubenswrapper[4574]: I1004 05:03:28.032133 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 04 05:03:28 crc kubenswrapper[4574]: I1004 05:03:28.032454 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 04 05:03:29 crc kubenswrapper[4574]: I1004 05:03:29.630620 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0"
Oct 04 05:03:29 crc kubenswrapper[4574]: E1004 05:03:29.630848 4574 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 04 05:03:29 crc kubenswrapper[4574]: E1004 05:03:29.631205 4574 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 04 05:03:29 crc kubenswrapper[4574]: E1004 05:03:29.631263 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift podName:74b762df-991e-4e0c-9be6-c3e468408254 nodeName:}" failed. No retries permitted until 2025-10-04 05:03:37.631248165 +0000 UTC m=+1043.485391207 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift") pod "swift-storage-0" (UID: "74b762df-991e-4e0c-9be6-c3e468408254") : configmap "swift-ring-files" not found
Oct 04 05:03:30 crc kubenswrapper[4574]: I1004 05:03:30.060900 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 04 05:03:30 crc kubenswrapper[4574]: I1004 05:03:30.117352 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.108406 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.158664 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.211408 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.669340 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gbs8h" event={"ID":"65ae5a48-3442-4149-9dbd-ac23191fa438","Type":"ContainerStarted","Data":"489be8b6719e7cf7f5899c10c92de956524f5023a189e3c2f2e85eb1b979ee38"}
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.671363 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-b7wnz"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.676472 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e306e34-ea03-4a60-9adc-99f30618be02","Type":"ContainerStarted","Data":"44f68b67afe6b66116339dc51dd988fc53650ed0d3dc12837380b9f1b0ed8664"}
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.676515 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e306e34-ea03-4a60-9adc-99f30618be02","Type":"ContainerStarted","Data":"6099b977e5a1d311bfdb4a39d9508c66d37dcffbffb35bba311bc897c10c0ef1"}
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.676532 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.690531 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-gbs8h" podStartSLOduration=2.165303759 podStartE2EDuration="10.69051055s" podCreationTimestamp="2025-10-04 05:03:22 +0000 UTC" firstStartedPulling="2025-10-04 05:03:23.289963229 +0000 UTC m=+1029.144106271" lastFinishedPulling="2025-10-04 05:03:31.81517002 +0000 UTC m=+1037.669313062" observedRunningTime="2025-10-04 05:03:32.684039655 +0000 UTC m=+1038.538182697" watchObservedRunningTime="2025-10-04 05:03:32.69051055 +0000 UTC m=+1038.544653592"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.759923 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.013426325 podStartE2EDuration="8.75989911s" podCreationTimestamp="2025-10-04 05:03:24 +0000 UTC" firstStartedPulling="2025-10-04 05:03:25.111203273 +0000 UTC m=+1030.965346315" lastFinishedPulling="2025-10-04 05:03:31.857676058 +0000 UTC m=+1037.711819100" observedRunningTime="2025-10-04 05:03:32.756894364 +0000 UTC m=+1038.611037406" watchObservedRunningTime="2025-10-04 05:03:32.75989911 +0000 UTC m=+1038.614042152"
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.803767 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vdgfn"]
Oct 04 05:03:32 crc kubenswrapper[4574]: I1004 05:03:32.804062 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" podUID="15175a86-4332-4519-a593-3914d0686e66" containerName="dnsmasq-dns" containerID="cri-o://1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856" gracePeriod=10
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.296668 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn"
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.409282 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-ovsdbserver-sb\") pod \"15175a86-4332-4519-a593-3914d0686e66\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") "
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.409732 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmddc\" (UniqueName: \"kubernetes.io/projected/15175a86-4332-4519-a593-3914d0686e66-kube-api-access-nmddc\") pod \"15175a86-4332-4519-a593-3914d0686e66\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") "
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.409919 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-dns-svc\") pod \"15175a86-4332-4519-a593-3914d0686e66\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") "
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.409994 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-config\") pod \"15175a86-4332-4519-a593-3914d0686e66\" (UID: \"15175a86-4332-4519-a593-3914d0686e66\") "
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.415878 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15175a86-4332-4519-a593-3914d0686e66-kube-api-access-nmddc" (OuterVolumeSpecName: "kube-api-access-nmddc") pod "15175a86-4332-4519-a593-3914d0686e66" (UID: "15175a86-4332-4519-a593-3914d0686e66"). InnerVolumeSpecName "kube-api-access-nmddc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.456546 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "15175a86-4332-4519-a593-3914d0686e66" (UID: "15175a86-4332-4519-a593-3914d0686e66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.459800 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-config" (OuterVolumeSpecName: "config") pod "15175a86-4332-4519-a593-3914d0686e66" (UID: "15175a86-4332-4519-a593-3914d0686e66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.476131 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15175a86-4332-4519-a593-3914d0686e66" (UID: "15175a86-4332-4519-a593-3914d0686e66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.512191 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.512245 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmddc\" (UniqueName: \"kubernetes.io/projected/15175a86-4332-4519-a593-3914d0686e66-kube-api-access-nmddc\") on node \"crc\" DevicePath \"\""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.512263 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.512275 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15175a86-4332-4519-a593-3914d0686e66-config\") on node \"crc\" DevicePath \"\""
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.683430 4574 generic.go:334] "Generic (PLEG): container finished" podID="15175a86-4332-4519-a593-3914d0686e66" containerID="1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856" exitCode=0
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.683481 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn"
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.683491 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" event={"ID":"15175a86-4332-4519-a593-3914d0686e66","Type":"ContainerDied","Data":"1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856"}
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.683556 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vdgfn" event={"ID":"15175a86-4332-4519-a593-3914d0686e66","Type":"ContainerDied","Data":"35ce0da653b5a3df1e2d185791c22313507638adf2dfe4c7d4cdc9ce4ac4fd23"}
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.683578 4574 scope.go:117] "RemoveContainer" containerID="1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856"
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.708304 4574 scope.go:117] "RemoveContainer" containerID="b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3"
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.715113 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vdgfn"]
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.721567 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vdgfn"]
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.736216 4574 scope.go:117] "RemoveContainer" containerID="1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856"
Oct 04 05:03:33 crc kubenswrapper[4574]: E1004 05:03:33.736781 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856\": container with ID starting with 1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856 not found: ID does not exist" containerID="1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856"
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.736827 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856"} err="failed to get container status \"1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856\": rpc error: code = NotFound desc = could not find container \"1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856\": container with ID starting with 1cf8c5d661117a0f045ae94d537e7dca277f0d96b24a45e8eae6b267f6df9856 not found: ID does not exist"
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.736853 4574 scope.go:117] "RemoveContainer" containerID="b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3"
Oct 04 05:03:33 crc kubenswrapper[4574]: E1004 05:03:33.737263 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3\": container with ID starting with b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3 not found: ID does not exist" containerID="b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3"
Oct 04 05:03:33 crc kubenswrapper[4574]: I1004 05:03:33.737313 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3"} err="failed to get container status \"b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3\": rpc error: code = NotFound desc = could not find container \"b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3\": container with ID starting with b14cb4d801e66b8887b3a4a5f4a87486f34f1cb74b09dfa817326512a15a86c3 not found: ID does not exist"
Oct 04 05:03:34 crc kubenswrapper[4574]: I1004 05:03:34.743977 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15175a86-4332-4519-a593-3914d0686e66" path="/var/lib/kubelet/pods/15175a86-4332-4519-a593-3914d0686e66/volumes"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.685598 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0"
Oct 04 05:03:37 crc kubenswrapper[4574]: E1004 05:03:37.686042 4574 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 04 05:03:37 crc kubenswrapper[4574]: E1004 05:03:37.686057 4574 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 04 05:03:37 crc kubenswrapper[4574]: E1004 05:03:37.686098 4574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift podName:74b762df-991e-4e0c-9be6-c3e468408254 nodeName:}" failed. No retries permitted until 2025-10-04 05:03:53.686084678 +0000 UTC m=+1059.540227720 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift") pod "swift-storage-0" (UID: "74b762df-991e-4e0c-9be6-c3e468408254") : configmap "swift-ring-files" not found
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.723554 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-69hmk"]
Oct 04 05:03:37 crc kubenswrapper[4574]: E1004 05:03:37.723925 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15175a86-4332-4519-a593-3914d0686e66" containerName="dnsmasq-dns"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.723942 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="15175a86-4332-4519-a593-3914d0686e66" containerName="dnsmasq-dns"
Oct 04 05:03:37 crc kubenswrapper[4574]: E1004 05:03:37.723961 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c70fc86-1d9c-43d4-aa53-1881f72d56b6" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.723968 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c70fc86-1d9c-43d4-aa53-1881f72d56b6" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: E1004 05:03:37.723976 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab095f1-eeec-4911-bc7b-35acc57e729c" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.723982 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab095f1-eeec-4911-bc7b-35acc57e729c" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: E1004 05:03:37.723991 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15175a86-4332-4519-a593-3914d0686e66" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.723996 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="15175a86-4332-4519-a593-3914d0686e66" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.724164 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="15175a86-4332-4519-a593-3914d0686e66" containerName="dnsmasq-dns"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.724177 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab095f1-eeec-4911-bc7b-35acc57e729c" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.724188 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c70fc86-1d9c-43d4-aa53-1881f72d56b6" containerName="init"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.724695 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-69hmk"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.735510 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-69hmk"]
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.787226 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg9fv\" (UniqueName: \"kubernetes.io/projected/03382769-56f5-45dc-b69c-099992058074-kube-api-access-lg9fv\") pod \"keystone-db-create-69hmk\" (UID: \"03382769-56f5-45dc-b69c-099992058074\") " pod="openstack/keystone-db-create-69hmk"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.889088 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg9fv\" (UniqueName: \"kubernetes.io/projected/03382769-56f5-45dc-b69c-099992058074-kube-api-access-lg9fv\") pod \"keystone-db-create-69hmk\" (UID: \"03382769-56f5-45dc-b69c-099992058074\") " pod="openstack/keystone-db-create-69hmk"
Oct 04 05:03:37 crc kubenswrapper[4574]: I1004 05:03:37.909440 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg9fv\" (UniqueName: \"kubernetes.io/projected/03382769-56f5-45dc-b69c-099992058074-kube-api-access-lg9fv\") pod \"keystone-db-create-69hmk\" (UID: \"03382769-56f5-45dc-b69c-099992058074\") " pod="openstack/keystone-db-create-69hmk"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.054699 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-69hmk"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.125443 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-9x4kd"]
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.126604 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9x4kd"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.141703 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9x4kd"]
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.194767 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlgf2\" (UniqueName: \"kubernetes.io/projected/944a360b-002a-4b31-8222-4a2949291694-kube-api-access-vlgf2\") pod \"placement-db-create-9x4kd\" (UID: \"944a360b-002a-4b31-8222-4a2949291694\") " pod="openstack/placement-db-create-9x4kd"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.296096 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlgf2\" (UniqueName: \"kubernetes.io/projected/944a360b-002a-4b31-8222-4a2949291694-kube-api-access-vlgf2\") pod \"placement-db-create-9x4kd\" (UID: \"944a360b-002a-4b31-8222-4a2949291694\") " pod="openstack/placement-db-create-9x4kd"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.320204 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlgf2\" (UniqueName: \"kubernetes.io/projected/944a360b-002a-4b31-8222-4a2949291694-kube-api-access-vlgf2\") pod \"placement-db-create-9x4kd\" (UID: \"944a360b-002a-4b31-8222-4a2949291694\") " pod="openstack/placement-db-create-9x4kd"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.346938 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ht5jl"]
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.347978 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ht5jl"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.370849 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ht5jl"]
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.397944 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-646sv\" (UniqueName: \"kubernetes.io/projected/fb796fd0-70b3-42d1-8b93-c034287498f6-kube-api-access-646sv\") pod \"glance-db-create-ht5jl\" (UID: \"fb796fd0-70b3-42d1-8b93-c034287498f6\") " pod="openstack/glance-db-create-ht5jl"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.486864 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9x4kd"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.499818 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-646sv\" (UniqueName: \"kubernetes.io/projected/fb796fd0-70b3-42d1-8b93-c034287498f6-kube-api-access-646sv\") pod \"glance-db-create-ht5jl\" (UID: \"fb796fd0-70b3-42d1-8b93-c034287498f6\") " pod="openstack/glance-db-create-ht5jl"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.522172 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-646sv\" (UniqueName: \"kubernetes.io/projected/fb796fd0-70b3-42d1-8b93-c034287498f6-kube-api-access-646sv\") pod \"glance-db-create-ht5jl\" (UID: \"fb796fd0-70b3-42d1-8b93-c034287498f6\") " pod="openstack/glance-db-create-ht5jl"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.621040 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-69hmk"]
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.679258 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ht5jl"
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.754948 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-69hmk" event={"ID":"03382769-56f5-45dc-b69c-099992058074","Type":"ContainerStarted","Data":"b3cc4cb85455c7d3e486ef9ea045094c4ecb29a20822a86bb45aa8e4aac22cd2"}
Oct 04 05:03:38 crc kubenswrapper[4574]: I1004 05:03:38.986638 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9x4kd"]
Oct 04 05:03:39 crc kubenswrapper[4574]: E1004 05:03:39.146584 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03382769_56f5_45dc_b69c_099992058074.slice/crio-conmon-78e7609ad41fa3866fe0ad635849f1eb6b025c249893a32cb355686451bf595d.scope\": RecentStats: unable to find data in memory cache]"
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.227444 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ht5jl"]
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.756382 4574 generic.go:334] "Generic (PLEG): container finished" podID="944a360b-002a-4b31-8222-4a2949291694" containerID="47aa5f056a57adfb702f38a7cd64aeda8d8aefd0a541963a7d1c7efd53f52b9b" exitCode=0
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.756467 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9x4kd" event={"ID":"944a360b-002a-4b31-8222-4a2949291694","Type":"ContainerDied","Data":"47aa5f056a57adfb702f38a7cd64aeda8d8aefd0a541963a7d1c7efd53f52b9b"}
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.756497 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9x4kd" event={"ID":"944a360b-002a-4b31-8222-4a2949291694","Type":"ContainerStarted","Data":"3b48ed766b0c82855e81e23bdac78526acdfc7cccfa622ba25a20947ebacb9e2"}
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.757926 4574 generic.go:334] "Generic (PLEG): container finished" podID="03382769-56f5-45dc-b69c-099992058074" containerID="78e7609ad41fa3866fe0ad635849f1eb6b025c249893a32cb355686451bf595d" exitCode=0
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.757977 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-69hmk" event={"ID":"03382769-56f5-45dc-b69c-099992058074","Type":"ContainerDied","Data":"78e7609ad41fa3866fe0ad635849f1eb6b025c249893a32cb355686451bf595d"}
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.759828 4574 generic.go:334] "Generic (PLEG): container finished" podID="16df8292-9780-4212-a920-bf0eed95da87" containerID="2428074d47972d1f6fdd6c280ab98af22b8aa63b1019d3e79680f303071f5225" exitCode=0
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.759900 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"16df8292-9780-4212-a920-bf0eed95da87","Type":"ContainerDied","Data":"2428074d47972d1f6fdd6c280ab98af22b8aa63b1019d3e79680f303071f5225"}
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.767897 4574 generic.go:334] "Generic (PLEG): container finished" podID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerID="19e755d98857189714271accecdc264d16ad48dbf72fe80113eb003f0a2478ba" exitCode=0
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.768102 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e3699c-e19d-4c38-b763-32af874a1a90","Type":"ContainerDied","Data":"19e755d98857189714271accecdc264d16ad48dbf72fe80113eb003f0a2478ba"}
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.771321 4574 generic.go:334] "Generic (PLEG): container finished" podID="fb796fd0-70b3-42d1-8b93-c034287498f6" containerID="4ff59bb7c1fff34c989100573e0415b28f453a891764111427b5f099b1090b84" exitCode=0
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.771375 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ht5jl" event={"ID":"fb796fd0-70b3-42d1-8b93-c034287498f6","Type":"ContainerDied","Data":"4ff59bb7c1fff34c989100573e0415b28f453a891764111427b5f099b1090b84"}
Oct 04 05:03:39 crc kubenswrapper[4574]: I1004 05:03:39.771400 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ht5jl" event={"ID":"fb796fd0-70b3-42d1-8b93-c034287498f6","Type":"ContainerStarted","Data":"e3e626958ec4170934d589f91143faa1047da7adfb92ffc97ed52f96e455475a"}
Oct 04 05:03:40 crc kubenswrapper[4574]: I1004 05:03:40.780035 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"16df8292-9780-4212-a920-bf0eed95da87","Type":"ContainerStarted","Data":"d5655f18f8668b0b0b32b184f1ce68bc9a08312686a470d75b2f8870edb99e71"}
Oct 04 05:03:40 crc kubenswrapper[4574]: I1004 05:03:40.780466 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 04 05:03:40 crc kubenswrapper[4574]: I1004 05:03:40.781843 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e3699c-e19d-4c38-b763-32af874a1a90","Type":"ContainerStarted","Data":"1278d15b8a81afdd76322b79acff8b815745598554a13cb237126afb3b6e9dd6"}
Oct 04 05:03:40 crc kubenswrapper[4574]: I1004 05:03:40.782047 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:03:40 crc kubenswrapper[4574]: I1004 05:03:40.783354 4574 generic.go:334] "Generic (PLEG): container finished" podID="65ae5a48-3442-4149-9dbd-ac23191fa438" containerID="489be8b6719e7cf7f5899c10c92de956524f5023a189e3c2f2e85eb1b979ee38" exitCode=0
Oct 04 05:03:40 crc kubenswrapper[4574]:
I1004 05:03:40.783440 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gbs8h" event={"ID":"65ae5a48-3442-4149-9dbd-ac23191fa438","Type":"ContainerDied","Data":"489be8b6719e7cf7f5899c10c92de956524f5023a189e3c2f2e85eb1b979ee38"} Oct 04 05:03:40 crc kubenswrapper[4574]: I1004 05:03:40.828229 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=47.403755081 podStartE2EDuration="58.828208688s" podCreationTimestamp="2025-10-04 05:02:42 +0000 UTC" firstStartedPulling="2025-10-04 05:02:54.701953307 +0000 UTC m=+1000.556096349" lastFinishedPulling="2025-10-04 05:03:06.126406914 +0000 UTC m=+1011.980549956" observedRunningTime="2025-10-04 05:03:40.825451349 +0000 UTC m=+1046.679594431" watchObservedRunningTime="2025-10-04 05:03:40.828208688 +0000 UTC m=+1046.682351730" Oct 04 05:03:40 crc kubenswrapper[4574]: I1004 05:03:40.875857 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.090377306 podStartE2EDuration="58.875841194s" podCreationTimestamp="2025-10-04 05:02:42 +0000 UTC" firstStartedPulling="2025-10-04 05:02:44.333603105 +0000 UTC m=+990.187746147" lastFinishedPulling="2025-10-04 05:03:06.119066993 +0000 UTC m=+1011.973210035" observedRunningTime="2025-10-04 05:03:40.864426787 +0000 UTC m=+1046.718569829" watchObservedRunningTime="2025-10-04 05:03:40.875841194 +0000 UTC m=+1046.729984236" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.374148 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9x4kd" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.383844 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-69hmk" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.390960 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ht5jl" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.478012 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlgf2\" (UniqueName: \"kubernetes.io/projected/944a360b-002a-4b31-8222-4a2949291694-kube-api-access-vlgf2\") pod \"944a360b-002a-4b31-8222-4a2949291694\" (UID: \"944a360b-002a-4b31-8222-4a2949291694\") " Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.478154 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-646sv\" (UniqueName: \"kubernetes.io/projected/fb796fd0-70b3-42d1-8b93-c034287498f6-kube-api-access-646sv\") pod \"fb796fd0-70b3-42d1-8b93-c034287498f6\" (UID: \"fb796fd0-70b3-42d1-8b93-c034287498f6\") " Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.478448 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg9fv\" (UniqueName: \"kubernetes.io/projected/03382769-56f5-45dc-b69c-099992058074-kube-api-access-lg9fv\") pod \"03382769-56f5-45dc-b69c-099992058074\" (UID: \"03382769-56f5-45dc-b69c-099992058074\") " Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.492420 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03382769-56f5-45dc-b69c-099992058074-kube-api-access-lg9fv" (OuterVolumeSpecName: "kube-api-access-lg9fv") pod "03382769-56f5-45dc-b69c-099992058074" (UID: "03382769-56f5-45dc-b69c-099992058074"). InnerVolumeSpecName "kube-api-access-lg9fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.492553 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944a360b-002a-4b31-8222-4a2949291694-kube-api-access-vlgf2" (OuterVolumeSpecName: "kube-api-access-vlgf2") pod "944a360b-002a-4b31-8222-4a2949291694" (UID: "944a360b-002a-4b31-8222-4a2949291694"). InnerVolumeSpecName "kube-api-access-vlgf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.492606 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb796fd0-70b3-42d1-8b93-c034287498f6-kube-api-access-646sv" (OuterVolumeSpecName: "kube-api-access-646sv") pod "fb796fd0-70b3-42d1-8b93-c034287498f6" (UID: "fb796fd0-70b3-42d1-8b93-c034287498f6"). InnerVolumeSpecName "kube-api-access-646sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.580661 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg9fv\" (UniqueName: \"kubernetes.io/projected/03382769-56f5-45dc-b69c-099992058074-kube-api-access-lg9fv\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.580700 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlgf2\" (UniqueName: \"kubernetes.io/projected/944a360b-002a-4b31-8222-4a2949291694-kube-api-access-vlgf2\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.580713 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-646sv\" (UniqueName: \"kubernetes.io/projected/fb796fd0-70b3-42d1-8b93-c034287498f6-kube-api-access-646sv\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.793545 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ht5jl" 
event={"ID":"fb796fd0-70b3-42d1-8b93-c034287498f6","Type":"ContainerDied","Data":"e3e626958ec4170934d589f91143faa1047da7adfb92ffc97ed52f96e455475a"} Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.793591 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3e626958ec4170934d589f91143faa1047da7adfb92ffc97ed52f96e455475a" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.793663 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ht5jl" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.795957 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9x4kd" event={"ID":"944a360b-002a-4b31-8222-4a2949291694","Type":"ContainerDied","Data":"3b48ed766b0c82855e81e23bdac78526acdfc7cccfa622ba25a20947ebacb9e2"} Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.795992 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b48ed766b0c82855e81e23bdac78526acdfc7cccfa622ba25a20947ebacb9e2" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.796053 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9x4kd" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.812154 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-69hmk" Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.812191 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-69hmk" event={"ID":"03382769-56f5-45dc-b69c-099992058074","Type":"ContainerDied","Data":"b3cc4cb85455c7d3e486ef9ea045094c4ecb29a20822a86bb45aa8e4aac22cd2"} Oct 04 05:03:41 crc kubenswrapper[4574]: I1004 05:03:41.812222 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3cc4cb85455c7d3e486ef9ea045094c4ecb29a20822a86bb45aa8e4aac22cd2" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.140798 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.196916 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skm7w\" (UniqueName: \"kubernetes.io/projected/65ae5a48-3442-4149-9dbd-ac23191fa438-kube-api-access-skm7w\") pod \"65ae5a48-3442-4149-9dbd-ac23191fa438\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.196990 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-combined-ca-bundle\") pod \"65ae5a48-3442-4149-9dbd-ac23191fa438\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.197035 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-ring-data-devices\") pod \"65ae5a48-3442-4149-9dbd-ac23191fa438\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.197078 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65ae5a48-3442-4149-9dbd-ac23191fa438-etc-swift\") pod \"65ae5a48-3442-4149-9dbd-ac23191fa438\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.197158 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-dispersionconf\") pod \"65ae5a48-3442-4149-9dbd-ac23191fa438\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.197215 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-scripts\") pod \"65ae5a48-3442-4149-9dbd-ac23191fa438\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.197249 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-swiftconf\") pod \"65ae5a48-3442-4149-9dbd-ac23191fa438\" (UID: \"65ae5a48-3442-4149-9dbd-ac23191fa438\") " Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.201636 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "65ae5a48-3442-4149-9dbd-ac23191fa438" (UID: "65ae5a48-3442-4149-9dbd-ac23191fa438"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.203494 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ae5a48-3442-4149-9dbd-ac23191fa438-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "65ae5a48-3442-4149-9dbd-ac23191fa438" (UID: "65ae5a48-3442-4149-9dbd-ac23191fa438"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.217135 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "65ae5a48-3442-4149-9dbd-ac23191fa438" (UID: "65ae5a48-3442-4149-9dbd-ac23191fa438"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.223507 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ae5a48-3442-4149-9dbd-ac23191fa438-kube-api-access-skm7w" (OuterVolumeSpecName: "kube-api-access-skm7w") pod "65ae5a48-3442-4149-9dbd-ac23191fa438" (UID: "65ae5a48-3442-4149-9dbd-ac23191fa438"). InnerVolumeSpecName "kube-api-access-skm7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.233987 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-scripts" (OuterVolumeSpecName: "scripts") pod "65ae5a48-3442-4149-9dbd-ac23191fa438" (UID: "65ae5a48-3442-4149-9dbd-ac23191fa438"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.252723 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "65ae5a48-3442-4149-9dbd-ac23191fa438" (UID: "65ae5a48-3442-4149-9dbd-ac23191fa438"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.255375 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65ae5a48-3442-4149-9dbd-ac23191fa438" (UID: "65ae5a48-3442-4149-9dbd-ac23191fa438"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.299269 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skm7w\" (UniqueName: \"kubernetes.io/projected/65ae5a48-3442-4149-9dbd-ac23191fa438-kube-api-access-skm7w\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.299517 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.299533 4574 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.299542 4574 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65ae5a48-3442-4149-9dbd-ac23191fa438-etc-swift\") on node \"crc\" 
DevicePath \"\"" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.299552 4574 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.299560 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65ae5a48-3442-4149-9dbd-ac23191fa438-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.299567 4574 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65ae5a48-3442-4149-9dbd-ac23191fa438-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.821985 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gbs8h" event={"ID":"65ae5a48-3442-4149-9dbd-ac23191fa438","Type":"ContainerDied","Data":"b3320748261642326f52265025407b474b05fd28d719e1798845ba9b8df5a7a3"} Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.822032 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3320748261642326f52265025407b474b05fd28d719e1798845ba9b8df5a7a3" Oct 04 05:03:42 crc kubenswrapper[4574]: I1004 05:03:42.822093 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gbs8h" Oct 04 05:03:44 crc kubenswrapper[4574]: I1004 05:03:44.236226 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-khsmk" podUID="3836030c-f0c4-4392-bc54-cc817fd89934" containerName="ovn-controller" probeResult="failure" output=< Oct 04 05:03:44 crc kubenswrapper[4574]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 04 05:03:44 crc kubenswrapper[4574]: > Oct 04 05:03:44 crc kubenswrapper[4574]: I1004 05:03:44.549924 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.746460 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1227-account-create-txzkj"] Oct 04 05:03:47 crc kubenswrapper[4574]: E1004 05:03:47.748018 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03382769-56f5-45dc-b69c-099992058074" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748106 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="03382769-56f5-45dc-b69c-099992058074" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: E1004 05:03:47.748175 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ae5a48-3442-4149-9dbd-ac23191fa438" containerName="swift-ring-rebalance" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748276 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ae5a48-3442-4149-9dbd-ac23191fa438" containerName="swift-ring-rebalance" Oct 04 05:03:47 crc kubenswrapper[4574]: E1004 05:03:47.748363 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944a360b-002a-4b31-8222-4a2949291694" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748420 4574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="944a360b-002a-4b31-8222-4a2949291694" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: E1004 05:03:47.748478 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb796fd0-70b3-42d1-8b93-c034287498f6" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748532 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb796fd0-70b3-42d1-8b93-c034287498f6" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748754 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ae5a48-3442-4149-9dbd-ac23191fa438" containerName="swift-ring-rebalance" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748820 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="03382769-56f5-45dc-b69c-099992058074" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748879 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="944a360b-002a-4b31-8222-4a2949291694" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.748936 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb796fd0-70b3-42d1-8b93-c034287498f6" containerName="mariadb-database-create" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.749540 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1227-account-create-txzkj" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.759484 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.795999 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1227-account-create-txzkj"] Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.796698 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z2cl\" (UniqueName: \"kubernetes.io/projected/cb3ee3f0-67c6-4a05-aa14-a5602906d364-kube-api-access-7z2cl\") pod \"keystone-1227-account-create-txzkj\" (UID: \"cb3ee3f0-67c6-4a05-aa14-a5602906d364\") " pod="openstack/keystone-1227-account-create-txzkj" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.898464 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z2cl\" (UniqueName: \"kubernetes.io/projected/cb3ee3f0-67c6-4a05-aa14-a5602906d364-kube-api-access-7z2cl\") pod \"keystone-1227-account-create-txzkj\" (UID: \"cb3ee3f0-67c6-4a05-aa14-a5602906d364\") " pod="openstack/keystone-1227-account-create-txzkj" Oct 04 05:03:47 crc kubenswrapper[4574]: I1004 05:03:47.927108 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z2cl\" (UniqueName: \"kubernetes.io/projected/cb3ee3f0-67c6-4a05-aa14-a5602906d364-kube-api-access-7z2cl\") pod \"keystone-1227-account-create-txzkj\" (UID: \"cb3ee3f0-67c6-4a05-aa14-a5602906d364\") " pod="openstack/keystone-1227-account-create-txzkj" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.067359 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1227-account-create-txzkj" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.264451 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2a59-account-create-b4x5h"] Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.268632 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a59-account-create-b4x5h" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.272015 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.290759 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a59-account-create-b4x5h"] Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.305672 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fvbm\" (UniqueName: \"kubernetes.io/projected/bba5c4cf-83d4-4518-9ce5-73dd9d4e0220-kube-api-access-6fvbm\") pod \"placement-2a59-account-create-b4x5h\" (UID: \"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220\") " pod="openstack/placement-2a59-account-create-b4x5h" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.408082 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fvbm\" (UniqueName: \"kubernetes.io/projected/bba5c4cf-83d4-4518-9ce5-73dd9d4e0220-kube-api-access-6fvbm\") pod \"placement-2a59-account-create-b4x5h\" (UID: \"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220\") " pod="openstack/placement-2a59-account-create-b4x5h" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.433094 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fvbm\" (UniqueName: \"kubernetes.io/projected/bba5c4cf-83d4-4518-9ce5-73dd9d4e0220-kube-api-access-6fvbm\") pod \"placement-2a59-account-create-b4x5h\" (UID: \"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220\") " 
pod="openstack/placement-2a59-account-create-b4x5h" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.459798 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-84c7-account-create-bjdjw"] Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.460807 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-84c7-account-create-bjdjw" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.464540 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.479378 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-84c7-account-create-bjdjw"] Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.509868 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqr6n\" (UniqueName: \"kubernetes.io/projected/b8f863e6-e6a0-44e9-afbe-f4734c7e2416-kube-api-access-zqr6n\") pod \"glance-84c7-account-create-bjdjw\" (UID: \"b8f863e6-e6a0-44e9-afbe-f4734c7e2416\") " pod="openstack/glance-84c7-account-create-bjdjw" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.567013 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1227-account-create-txzkj"] Oct 04 05:03:48 crc kubenswrapper[4574]: W1004 05:03:48.590651 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3ee3f0_67c6_4a05_aa14_a5602906d364.slice/crio-13d8ab3a86d5c9ab43aeee0617c206c1f272d581bd164b6ec899ffe4b69c636d WatchSource:0}: Error finding container 13d8ab3a86d5c9ab43aeee0617c206c1f272d581bd164b6ec899ffe4b69c636d: Status 404 returned error can't find the container with id 13d8ab3a86d5c9ab43aeee0617c206c1f272d581bd164b6ec899ffe4b69c636d Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.600859 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2a59-account-create-b4x5h" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.611380 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqr6n\" (UniqueName: \"kubernetes.io/projected/b8f863e6-e6a0-44e9-afbe-f4734c7e2416-kube-api-access-zqr6n\") pod \"glance-84c7-account-create-bjdjw\" (UID: \"b8f863e6-e6a0-44e9-afbe-f4734c7e2416\") " pod="openstack/glance-84c7-account-create-bjdjw" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.631653 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqr6n\" (UniqueName: \"kubernetes.io/projected/b8f863e6-e6a0-44e9-afbe-f4734c7e2416-kube-api-access-zqr6n\") pod \"glance-84c7-account-create-bjdjw\" (UID: \"b8f863e6-e6a0-44e9-afbe-f4734c7e2416\") " pod="openstack/glance-84c7-account-create-bjdjw" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.788221 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-84c7-account-create-bjdjw" Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.867110 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1227-account-create-txzkj" event={"ID":"cb3ee3f0-67c6-4a05-aa14-a5602906d364","Type":"ContainerStarted","Data":"df7b5541b583b8d5d6dd1d9e1606a1c96f1b8c8717d8e3dd8d38efbb8a446b10"} Oct 04 05:03:48 crc kubenswrapper[4574]: I1004 05:03:48.867155 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1227-account-create-txzkj" event={"ID":"cb3ee3f0-67c6-4a05-aa14-a5602906d364","Type":"ContainerStarted","Data":"13d8ab3a86d5c9ab43aeee0617c206c1f272d581bd164b6ec899ffe4b69c636d"} Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.085574 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-1227-account-create-txzkj" podStartSLOduration=2.085555368 podStartE2EDuration="2.085555368s" podCreationTimestamp="2025-10-04 05:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:03:48.890627328 +0000 UTC m=+1054.744770370" watchObservedRunningTime="2025-10-04 05:03:49.085555368 +0000 UTC m=+1054.939698410" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.092309 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a59-account-create-b4x5h"] Oct 04 05:03:49 crc kubenswrapper[4574]: W1004 05:03:49.094637 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbba5c4cf_83d4_4518_9ce5_73dd9d4e0220.slice/crio-9ea3a3dd55020819f886c71831e342a47dc5c5734adf0e6a430a6759463a44a6 WatchSource:0}: Error finding container 9ea3a3dd55020819f886c71831e342a47dc5c5734adf0e6a430a6759463a44a6: Status 404 returned error can't find the container with id 
9ea3a3dd55020819f886c71831e342a47dc5c5734adf0e6a430a6759463a44a6 Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.123856 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-84c7-account-create-bjdjw"] Oct 04 05:03:49 crc kubenswrapper[4574]: W1004 05:03:49.135212 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f863e6_e6a0_44e9_afbe_f4734c7e2416.slice/crio-a036c05f765713106c8076bcf025ac7ef7e7f4bc5c0659531b69466236724b72 WatchSource:0}: Error finding container a036c05f765713106c8076bcf025ac7ef7e7f4bc5c0659531b69466236724b72: Status 404 returned error can't find the container with id a036c05f765713106c8076bcf025ac7ef7e7f4bc5c0659531b69466236724b72 Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.279317 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-khsmk" podUID="3836030c-f0c4-4392-bc54-cc817fd89934" containerName="ovn-controller" probeResult="failure" output=< Oct 04 05:03:49 crc kubenswrapper[4574]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 04 05:03:49 crc kubenswrapper[4574]: > Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.292599 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.301765 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gl29s" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.543955 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-khsmk-config-d4ngx"] Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.544949 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.547129 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.622136 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-khsmk-config-d4ngx"] Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.653880 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-log-ovn\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.653924 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-additional-scripts\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.653950 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-scripts\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.653986 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7588v\" (UniqueName: \"kubernetes.io/projected/71ac1418-d604-444e-8faf-48489cff88ca-kube-api-access-7588v\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: 
\"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.654022 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run-ovn\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.654118 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756183 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756321 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-log-ovn\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756352 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-additional-scripts\") pod \"ovn-controller-khsmk-config-d4ngx\" 
(UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756378 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-scripts\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756423 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7588v\" (UniqueName: \"kubernetes.io/projected/71ac1418-d604-444e-8faf-48489cff88ca-kube-api-access-7588v\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756468 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run-ovn\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756607 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756687 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run-ovn\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: 
\"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.756733 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-log-ovn\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.757858 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-additional-scripts\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.758893 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-scripts\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.784947 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7588v\" (UniqueName: \"kubernetes.io/projected/71ac1418-d604-444e-8faf-48489cff88ca-kube-api-access-7588v\") pod \"ovn-controller-khsmk-config-d4ngx\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.861273 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.892129 4574 generic.go:334] "Generic (PLEG): container finished" podID="cb3ee3f0-67c6-4a05-aa14-a5602906d364" containerID="df7b5541b583b8d5d6dd1d9e1606a1c96f1b8c8717d8e3dd8d38efbb8a446b10" exitCode=0 Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.892226 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1227-account-create-txzkj" event={"ID":"cb3ee3f0-67c6-4a05-aa14-a5602906d364","Type":"ContainerDied","Data":"df7b5541b583b8d5d6dd1d9e1606a1c96f1b8c8717d8e3dd8d38efbb8a446b10"} Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.909959 4574 generic.go:334] "Generic (PLEG): container finished" podID="b8f863e6-e6a0-44e9-afbe-f4734c7e2416" containerID="2d9abd6a53e5fe383c3bcc7fa051c48e4c89ec695bdb630ff2cc65c14c7c5114" exitCode=0 Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.910058 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-84c7-account-create-bjdjw" event={"ID":"b8f863e6-e6a0-44e9-afbe-f4734c7e2416","Type":"ContainerDied","Data":"2d9abd6a53e5fe383c3bcc7fa051c48e4c89ec695bdb630ff2cc65c14c7c5114"} Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.910090 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-84c7-account-create-bjdjw" event={"ID":"b8f863e6-e6a0-44e9-afbe-f4734c7e2416","Type":"ContainerStarted","Data":"a036c05f765713106c8076bcf025ac7ef7e7f4bc5c0659531b69466236724b72"} Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.916122 4574 generic.go:334] "Generic (PLEG): container finished" podID="bba5c4cf-83d4-4518-9ce5-73dd9d4e0220" containerID="35de2fbc9f58be3c2711718e28d35dae95b4d2c1baeb9879af535623cad155a7" exitCode=0 Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.917273 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a59-account-create-b4x5h" 
event={"ID":"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220","Type":"ContainerDied","Data":"35de2fbc9f58be3c2711718e28d35dae95b4d2c1baeb9879af535623cad155a7"} Oct 04 05:03:49 crc kubenswrapper[4574]: I1004 05:03:49.917306 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a59-account-create-b4x5h" event={"ID":"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220","Type":"ContainerStarted","Data":"9ea3a3dd55020819f886c71831e342a47dc5c5734adf0e6a430a6759463a44a6"} Oct 04 05:03:50 crc kubenswrapper[4574]: I1004 05:03:50.163992 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-khsmk-config-d4ngx"] Oct 04 05:03:50 crc kubenswrapper[4574]: I1004 05:03:50.925143 4574 generic.go:334] "Generic (PLEG): container finished" podID="71ac1418-d604-444e-8faf-48489cff88ca" containerID="76051f35e51e8a4820c47cfcc47ff16bc1456f3d7a405600540c6f19a3b961a8" exitCode=0 Oct 04 05:03:50 crc kubenswrapper[4574]: I1004 05:03:50.925194 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk-config-d4ngx" event={"ID":"71ac1418-d604-444e-8faf-48489cff88ca","Type":"ContainerDied","Data":"76051f35e51e8a4820c47cfcc47ff16bc1456f3d7a405600540c6f19a3b961a8"} Oct 04 05:03:50 crc kubenswrapper[4574]: I1004 05:03:50.925568 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk-config-d4ngx" event={"ID":"71ac1418-d604-444e-8faf-48489cff88ca","Type":"ContainerStarted","Data":"8180e00b1a1a101539c67488147720efa3fdb79e88db06b16459b1cdca6eb706"} Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.347937 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a59-account-create-b4x5h" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.444272 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1227-account-create-txzkj" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.454678 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-84c7-account-create-bjdjw" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.494212 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fvbm\" (UniqueName: \"kubernetes.io/projected/bba5c4cf-83d4-4518-9ce5-73dd9d4e0220-kube-api-access-6fvbm\") pod \"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220\" (UID: \"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220\") " Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.503728 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba5c4cf-83d4-4518-9ce5-73dd9d4e0220-kube-api-access-6fvbm" (OuterVolumeSpecName: "kube-api-access-6fvbm") pod "bba5c4cf-83d4-4518-9ce5-73dd9d4e0220" (UID: "bba5c4cf-83d4-4518-9ce5-73dd9d4e0220"). InnerVolumeSpecName "kube-api-access-6fvbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.595719 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z2cl\" (UniqueName: \"kubernetes.io/projected/cb3ee3f0-67c6-4a05-aa14-a5602906d364-kube-api-access-7z2cl\") pod \"cb3ee3f0-67c6-4a05-aa14-a5602906d364\" (UID: \"cb3ee3f0-67c6-4a05-aa14-a5602906d364\") " Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.595869 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqr6n\" (UniqueName: \"kubernetes.io/projected/b8f863e6-e6a0-44e9-afbe-f4734c7e2416-kube-api-access-zqr6n\") pod \"b8f863e6-e6a0-44e9-afbe-f4734c7e2416\" (UID: \"b8f863e6-e6a0-44e9-afbe-f4734c7e2416\") " Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.596434 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fvbm\" (UniqueName: \"kubernetes.io/projected/bba5c4cf-83d4-4518-9ce5-73dd9d4e0220-kube-api-access-6fvbm\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.599037 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f863e6-e6a0-44e9-afbe-f4734c7e2416-kube-api-access-zqr6n" (OuterVolumeSpecName: "kube-api-access-zqr6n") pod "b8f863e6-e6a0-44e9-afbe-f4734c7e2416" (UID: "b8f863e6-e6a0-44e9-afbe-f4734c7e2416"). InnerVolumeSpecName "kube-api-access-zqr6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.599721 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3ee3f0-67c6-4a05-aa14-a5602906d364-kube-api-access-7z2cl" (OuterVolumeSpecName: "kube-api-access-7z2cl") pod "cb3ee3f0-67c6-4a05-aa14-a5602906d364" (UID: "cb3ee3f0-67c6-4a05-aa14-a5602906d364"). InnerVolumeSpecName "kube-api-access-7z2cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.698064 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z2cl\" (UniqueName: \"kubernetes.io/projected/cb3ee3f0-67c6-4a05-aa14-a5602906d364-kube-api-access-7z2cl\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.698119 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqr6n\" (UniqueName: \"kubernetes.io/projected/b8f863e6-e6a0-44e9-afbe-f4734c7e2416-kube-api-access-zqr6n\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.935011 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1227-account-create-txzkj" event={"ID":"cb3ee3f0-67c6-4a05-aa14-a5602906d364","Type":"ContainerDied","Data":"13d8ab3a86d5c9ab43aeee0617c206c1f272d581bd164b6ec899ffe4b69c636d"} Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.935084 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d8ab3a86d5c9ab43aeee0617c206c1f272d581bd164b6ec899ffe4b69c636d" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.935051 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1227-account-create-txzkj" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.936666 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-84c7-account-create-bjdjw" event={"ID":"b8f863e6-e6a0-44e9-afbe-f4734c7e2416","Type":"ContainerDied","Data":"a036c05f765713106c8076bcf025ac7ef7e7f4bc5c0659531b69466236724b72"} Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.936712 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a036c05f765713106c8076bcf025ac7ef7e7f4bc5c0659531b69466236724b72" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.936679 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-84c7-account-create-bjdjw" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.938095 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a59-account-create-b4x5h" Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.941443 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a59-account-create-b4x5h" event={"ID":"bba5c4cf-83d4-4518-9ce5-73dd9d4e0220","Type":"ContainerDied","Data":"9ea3a3dd55020819f886c71831e342a47dc5c5734adf0e6a430a6759463a44a6"} Oct 04 05:03:51 crc kubenswrapper[4574]: I1004 05:03:51.941489 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea3a3dd55020819f886c71831e342a47dc5c5734adf0e6a430a6759463a44a6" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.142126 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.206048 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-scripts\") pod \"71ac1418-d604-444e-8faf-48489cff88ca\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.206142 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7588v\" (UniqueName: \"kubernetes.io/projected/71ac1418-d604-444e-8faf-48489cff88ca-kube-api-access-7588v\") pod \"71ac1418-d604-444e-8faf-48489cff88ca\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.206164 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run-ovn\") pod \"71ac1418-d604-444e-8faf-48489cff88ca\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.206222 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-additional-scripts\") pod \"71ac1418-d604-444e-8faf-48489cff88ca\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.206341 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-log-ovn\") pod \"71ac1418-d604-444e-8faf-48489cff88ca\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.206377 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run\") pod \"71ac1418-d604-444e-8faf-48489cff88ca\" (UID: \"71ac1418-d604-444e-8faf-48489cff88ca\") " Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.206842 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run" (OuterVolumeSpecName: "var-run") pod "71ac1418-d604-444e-8faf-48489cff88ca" (UID: "71ac1418-d604-444e-8faf-48489cff88ca"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.207551 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "71ac1418-d604-444e-8faf-48489cff88ca" (UID: "71ac1418-d604-444e-8faf-48489cff88ca"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.207591 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "71ac1418-d604-444e-8faf-48489cff88ca" (UID: "71ac1418-d604-444e-8faf-48489cff88ca"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.207895 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "71ac1418-d604-444e-8faf-48489cff88ca" (UID: "71ac1418-d604-444e-8faf-48489cff88ca"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.208493 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-scripts" (OuterVolumeSpecName: "scripts") pod "71ac1418-d604-444e-8faf-48489cff88ca" (UID: "71ac1418-d604-444e-8faf-48489cff88ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.211655 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ac1418-d604-444e-8faf-48489cff88ca-kube-api-access-7588v" (OuterVolumeSpecName: "kube-api-access-7588v") pod "71ac1418-d604-444e-8faf-48489cff88ca" (UID: "71ac1418-d604-444e-8faf-48489cff88ca"). InnerVolumeSpecName "kube-api-access-7588v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.308461 4574 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.308492 4574 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.308503 4574 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.308512 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71ac1418-d604-444e-8faf-48489cff88ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 
05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.308521 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7588v\" (UniqueName: \"kubernetes.io/projected/71ac1418-d604-444e-8faf-48489cff88ca-kube-api-access-7588v\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.308531 4574 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/71ac1418-d604-444e-8faf-48489cff88ca-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.956195 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk-config-d4ngx" event={"ID":"71ac1418-d604-444e-8faf-48489cff88ca","Type":"ContainerDied","Data":"8180e00b1a1a101539c67488147720efa3fdb79e88db06b16459b1cdca6eb706"} Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.956270 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8180e00b1a1a101539c67488147720efa3fdb79e88db06b16459b1cdca6eb706" Oct 04 05:03:52 crc kubenswrapper[4574]: I1004 05:03:52.956291 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-khsmk-config-d4ngx" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.238747 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-khsmk-config-d4ngx"] Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.246534 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-khsmk-config-d4ngx"] Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.360964 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-khsmk-config-kcbhn"] Oct 04 05:03:53 crc kubenswrapper[4574]: E1004 05:03:53.361312 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f863e6-e6a0-44e9-afbe-f4734c7e2416" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361327 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f863e6-e6a0-44e9-afbe-f4734c7e2416" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: E1004 05:03:53.361343 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ac1418-d604-444e-8faf-48489cff88ca" containerName="ovn-config" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361350 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ac1418-d604-444e-8faf-48489cff88ca" containerName="ovn-config" Oct 04 05:03:53 crc kubenswrapper[4574]: E1004 05:03:53.361360 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3ee3f0-67c6-4a05-aa14-a5602906d364" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361366 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3ee3f0-67c6-4a05-aa14-a5602906d364" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: E1004 05:03:53.361376 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba5c4cf-83d4-4518-9ce5-73dd9d4e0220" containerName="mariadb-account-create" 
Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361382 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba5c4cf-83d4-4518-9ce5-73dd9d4e0220" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361532 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ac1418-d604-444e-8faf-48489cff88ca" containerName="ovn-config" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361551 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba5c4cf-83d4-4518-9ce5-73dd9d4e0220" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361566 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f863e6-e6a0-44e9-afbe-f4734c7e2416" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.361576 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3ee3f0-67c6-4a05-aa14-a5602906d364" containerName="mariadb-account-create" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.362076 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.363917 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.377776 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-khsmk-config-kcbhn"] Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.426633 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-log-ovn\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.426707 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-scripts\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.426763 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.426835 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srgdd\" (UniqueName: \"kubernetes.io/projected/4606794e-3d8f-4aa5-80fb-c918ffd40937-kube-api-access-srgdd\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: 
\"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.426866 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-additional-scripts\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.426884 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run-ovn\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528116 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-log-ovn\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528184 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-scripts\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528250 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run\") pod \"ovn-controller-khsmk-config-kcbhn\" 
(UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528330 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srgdd\" (UniqueName: \"kubernetes.io/projected/4606794e-3d8f-4aa5-80fb-c918ffd40937-kube-api-access-srgdd\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528366 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-additional-scripts\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528392 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run-ovn\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528753 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run-ovn\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.528825 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-log-ovn\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: 
\"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.529539 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.530468 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-additional-scripts\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.531175 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-scripts\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.551350 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srgdd\" (UniqueName: \"kubernetes.io/projected/4606794e-3d8f-4aa5-80fb-c918ffd40937-kube-api-access-srgdd\") pod \"ovn-controller-khsmk-config-kcbhn\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.679287 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.703323 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.723069 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xnbbl"] Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.724500 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.730038 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.732924 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zkflm" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.734851 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.738079 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnbbl"] Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.744512 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/74b762df-991e-4e0c-9be6-c3e468408254-etc-swift\") pod \"swift-storage-0\" (UID: \"74b762df-991e-4e0c-9be6-c3e468408254\") " pod="openstack/swift-storage-0" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.843921 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-combined-ca-bundle\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.844011 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pm6\" (UniqueName: \"kubernetes.io/projected/59854ff7-fdcf-4a21-9fa6-9ab422be068e-kube-api-access-q8pm6\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.844048 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-config-data\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.844734 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-db-sync-config-data\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.905711 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.946724 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-db-sync-config-data\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.946835 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-combined-ca-bundle\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.946877 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pm6\" (UniqueName: \"kubernetes.io/projected/59854ff7-fdcf-4a21-9fa6-9ab422be068e-kube-api-access-q8pm6\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.946911 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-config-data\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.952156 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-combined-ca-bundle\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.955360 
4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-config-data\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.956160 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-db-sync-config-data\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:53 crc kubenswrapper[4574]: I1004 05:03:53.977062 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pm6\" (UniqueName: \"kubernetes.io/projected/59854ff7-fdcf-4a21-9fa6-9ab422be068e-kube-api-access-q8pm6\") pod \"glance-db-sync-xnbbl\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.052547 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-khsmk-config-kcbhn"] Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.114668 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnbbl" Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.261809 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.272991 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-khsmk" Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.661279 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.746940 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ac1418-d604-444e-8faf-48489cff88ca" path="/var/lib/kubelet/pods/71ac1418-d604-444e-8faf-48489cff88ca/volumes" Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.894872 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnbbl"] Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.983863 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk-config-kcbhn" event={"ID":"4606794e-3d8f-4aa5-80fb-c918ffd40937","Type":"ContainerStarted","Data":"0388828482fb6b864c1bef5ccd6e44ad60cb0ffe96fef79814984f65deb9ea81"} Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.983908 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk-config-kcbhn" event={"ID":"4606794e-3d8f-4aa5-80fb-c918ffd40937","Type":"ContainerStarted","Data":"ec81ede9d228bfdbc2f546ba581336314f395d3d89baf94916647e3f7742f784"} Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.986573 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnbbl" event={"ID":"59854ff7-fdcf-4a21-9fa6-9ab422be068e","Type":"ContainerStarted","Data":"3348d72c9eceb452ec55dff079a9f2b9778ca07cb8dbdaf716989e3aa26349a6"} Oct 04 05:03:54 crc kubenswrapper[4574]: I1004 05:03:54.988134 4574 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"f8ca66be939231f931bc75b290af3de5a9c2db8d035fa4e0fd0539062831012d"} Oct 04 05:03:55 crc kubenswrapper[4574]: I1004 05:03:55.998523 4574 generic.go:334] "Generic (PLEG): container finished" podID="4606794e-3d8f-4aa5-80fb-c918ffd40937" containerID="0388828482fb6b864c1bef5ccd6e44ad60cb0ffe96fef79814984f65deb9ea81" exitCode=0 Oct 04 05:03:55 crc kubenswrapper[4574]: I1004 05:03:55.998715 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk-config-kcbhn" event={"ID":"4606794e-3d8f-4aa5-80fb-c918ffd40937","Type":"ContainerDied","Data":"0388828482fb6b864c1bef5ccd6e44ad60cb0ffe96fef79814984f65deb9ea81"} Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.357555 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2ssvl"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.358519 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2ssvl" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.372692 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2ssvl"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.417452 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzkdm\" (UniqueName: \"kubernetes.io/projected/f937ae2e-fb2c-4632-b030-e4547cb604bf-kube-api-access-pzkdm\") pod \"cinder-db-create-2ssvl\" (UID: \"f937ae2e-fb2c-4632-b030-e4547cb604bf\") " pod="openstack/cinder-db-create-2ssvl" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.440949 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-b96fh"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.446381 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-b96fh" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.455377 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-b96fh"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.520488 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzkdm\" (UniqueName: \"kubernetes.io/projected/f937ae2e-fb2c-4632-b030-e4547cb604bf-kube-api-access-pzkdm\") pod \"cinder-db-create-2ssvl\" (UID: \"f937ae2e-fb2c-4632-b030-e4547cb604bf\") " pod="openstack/cinder-db-create-2ssvl" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.520562 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhlwt\" (UniqueName: \"kubernetes.io/projected/f80e4ab0-c323-4894-8d2d-23d6e380aeb1-kube-api-access-fhlwt\") pod \"barbican-db-create-b96fh\" (UID: \"f80e4ab0-c323-4894-8d2d-23d6e380aeb1\") " pod="openstack/barbican-db-create-b96fh" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.561751 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzkdm\" (UniqueName: \"kubernetes.io/projected/f937ae2e-fb2c-4632-b030-e4547cb604bf-kube-api-access-pzkdm\") pod \"cinder-db-create-2ssvl\" (UID: \"f937ae2e-fb2c-4632-b030-e4547cb604bf\") " pod="openstack/cinder-db-create-2ssvl" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.622218 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhlwt\" (UniqueName: \"kubernetes.io/projected/f80e4ab0-c323-4894-8d2d-23d6e380aeb1-kube-api-access-fhlwt\") pod \"barbican-db-create-b96fh\" (UID: \"f80e4ab0-c323-4894-8d2d-23d6e380aeb1\") " pod="openstack/barbican-db-create-b96fh" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.653054 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhlwt\" (UniqueName: 
\"kubernetes.io/projected/f80e4ab0-c323-4894-8d2d-23d6e380aeb1-kube-api-access-fhlwt\") pod \"barbican-db-create-b96fh\" (UID: \"f80e4ab0-c323-4894-8d2d-23d6e380aeb1\") " pod="openstack/barbican-db-create-b96fh" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.663417 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-s496q"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.664526 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s496q" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.665595 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-g5d9n"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.666256 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.668427 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.668736 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-49xhf" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.668809 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.671105 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.680968 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2ssvl" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.725543 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sk2k\" (UniqueName: \"kubernetes.io/projected/65e6f397-f39a-4b3b-af37-05020e371987-kube-api-access-5sk2k\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.725620 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-combined-ca-bundle\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.725692 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2cw2\" (UniqueName: \"kubernetes.io/projected/6440c95c-883f-4d9e-b095-6589637f1059-kube-api-access-b2cw2\") pod \"neutron-db-create-s496q\" (UID: \"6440c95c-883f-4d9e-b095-6589637f1059\") " pod="openstack/neutron-db-create-s496q" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.725739 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-config-data\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.766523 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-b96fh" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.783327 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s496q"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.791472 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g5d9n"] Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.827654 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sk2k\" (UniqueName: \"kubernetes.io/projected/65e6f397-f39a-4b3b-af37-05020e371987-kube-api-access-5sk2k\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.827971 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-combined-ca-bundle\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.828134 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2cw2\" (UniqueName: \"kubernetes.io/projected/6440c95c-883f-4d9e-b095-6589637f1059-kube-api-access-b2cw2\") pod \"neutron-db-create-s496q\" (UID: \"6440c95c-883f-4d9e-b095-6589637f1059\") " pod="openstack/neutron-db-create-s496q" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.828249 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-config-data\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.832908 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-combined-ca-bundle\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.833968 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-config-data\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.867801 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2cw2\" (UniqueName: \"kubernetes.io/projected/6440c95c-883f-4d9e-b095-6589637f1059-kube-api-access-b2cw2\") pod \"neutron-db-create-s496q\" (UID: \"6440c95c-883f-4d9e-b095-6589637f1059\") " pod="openstack/neutron-db-create-s496q" Oct 04 05:03:56 crc kubenswrapper[4574]: I1004 05:03:56.873276 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sk2k\" (UniqueName: \"kubernetes.io/projected/65e6f397-f39a-4b3b-af37-05020e371987-kube-api-access-5sk2k\") pod \"keystone-db-sync-g5d9n\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.010847 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-s496q" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.024267 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.681946 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2ssvl"] Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.872849 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.954367 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-log-ovn\") pod \"4606794e-3d8f-4aa5-80fb-c918ffd40937\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.954474 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-scripts\") pod \"4606794e-3d8f-4aa5-80fb-c918ffd40937\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.954516 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run-ovn\") pod \"4606794e-3d8f-4aa5-80fb-c918ffd40937\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.954520 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4606794e-3d8f-4aa5-80fb-c918ffd40937" (UID: "4606794e-3d8f-4aa5-80fb-c918ffd40937"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.954536 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srgdd\" (UniqueName: \"kubernetes.io/projected/4606794e-3d8f-4aa5-80fb-c918ffd40937-kube-api-access-srgdd\") pod \"4606794e-3d8f-4aa5-80fb-c918ffd40937\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.954828 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-additional-scripts\") pod \"4606794e-3d8f-4aa5-80fb-c918ffd40937\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.954951 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run\") pod \"4606794e-3d8f-4aa5-80fb-c918ffd40937\" (UID: \"4606794e-3d8f-4aa5-80fb-c918ffd40937\") " Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.955874 4574 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.955920 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4606794e-3d8f-4aa5-80fb-c918ffd40937" (UID: "4606794e-3d8f-4aa5-80fb-c918ffd40937"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.955931 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run" (OuterVolumeSpecName: "var-run") pod "4606794e-3d8f-4aa5-80fb-c918ffd40937" (UID: "4606794e-3d8f-4aa5-80fb-c918ffd40937"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.956028 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-scripts" (OuterVolumeSpecName: "scripts") pod "4606794e-3d8f-4aa5-80fb-c918ffd40937" (UID: "4606794e-3d8f-4aa5-80fb-c918ffd40937"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.956294 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4606794e-3d8f-4aa5-80fb-c918ffd40937" (UID: "4606794e-3d8f-4aa5-80fb-c918ffd40937"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:03:57 crc kubenswrapper[4574]: I1004 05:03:57.960587 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4606794e-3d8f-4aa5-80fb-c918ffd40937-kube-api-access-srgdd" (OuterVolumeSpecName: "kube-api-access-srgdd") pod "4606794e-3d8f-4aa5-80fb-c918ffd40937" (UID: "4606794e-3d8f-4aa5-80fb-c918ffd40937"). InnerVolumeSpecName "kube-api-access-srgdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.043017 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-khsmk-config-kcbhn" event={"ID":"4606794e-3d8f-4aa5-80fb-c918ffd40937","Type":"ContainerDied","Data":"ec81ede9d228bfdbc2f546ba581336314f395d3d89baf94916647e3f7742f784"} Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.043404 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec81ede9d228bfdbc2f546ba581336314f395d3d89baf94916647e3f7742f784" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.043501 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-khsmk-config-kcbhn" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.051542 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"27b91f6fdc7f872f0967d33ded4555c8854bf9802afbed63ae559c9c6e17c547"} Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.057505 4574 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.057533 4574 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.057542 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4606794e-3d8f-4aa5-80fb-c918ffd40937-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.057552 4574 reconciler_common.go:293] "Volume detached for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4606794e-3d8f-4aa5-80fb-c918ffd40937-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.057561 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srgdd\" (UniqueName: \"kubernetes.io/projected/4606794e-3d8f-4aa5-80fb-c918ffd40937-kube-api-access-srgdd\") on node \"crc\" DevicePath \"\"" Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.070755 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2ssvl" event={"ID":"f937ae2e-fb2c-4632-b030-e4547cb604bf","Type":"ContainerStarted","Data":"f777cca7511251879d483cfd060366a339100443436c44b3b3b5df53d9f00598"} Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.126420 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g5d9n"] Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.132264 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-b96fh"] Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.281887 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-s496q"] Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.946370 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-khsmk-config-kcbhn"] Oct 04 05:03:58 crc kubenswrapper[4574]: I1004 05:03:58.954866 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-khsmk-config-kcbhn"] Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.087730 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"b64d9cfc03e487d4ad1f00c61cf11474a0099fc5caa212968a24dd0d1ea04386"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.087777 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"38e8109532b0401873ad41c0d77d991aaf4bab73d1b79da421ec2d0ffe59a24f"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.087790 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"25569acd923917def40c134ae5d152cf7e30bd6e77cd2f462f8b00b6878663b2"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.090451 4574 generic.go:334] "Generic (PLEG): container finished" podID="f80e4ab0-c323-4894-8d2d-23d6e380aeb1" containerID="50a72173a69ddc6eaa1b33fc15981e0f540f181d6f9eb3a580d63c7c6aa335d1" exitCode=0 Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.090522 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b96fh" event={"ID":"f80e4ab0-c323-4894-8d2d-23d6e380aeb1","Type":"ContainerDied","Data":"50a72173a69ddc6eaa1b33fc15981e0f540f181d6f9eb3a580d63c7c6aa335d1"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.090549 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b96fh" event={"ID":"f80e4ab0-c323-4894-8d2d-23d6e380aeb1","Type":"ContainerStarted","Data":"d4e8afff3d1908cf7135e35193847f20acc467abb9e79a9086927af558dc2ee7"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.092720 4574 generic.go:334] "Generic (PLEG): container finished" podID="6440c95c-883f-4d9e-b095-6589637f1059" containerID="f6d3f86c7280e708740c2665de9fa74c18344b39bdfaede64f0187ba16a7977d" exitCode=0 Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.092781 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s496q" event={"ID":"6440c95c-883f-4d9e-b095-6589637f1059","Type":"ContainerDied","Data":"f6d3f86c7280e708740c2665de9fa74c18344b39bdfaede64f0187ba16a7977d"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.092808 4574 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-db-create-s496q" event={"ID":"6440c95c-883f-4d9e-b095-6589637f1059","Type":"ContainerStarted","Data":"0e0a215b1fa1cdcaf07f7ad1d2a448717b3f47d35c3a2cbffd5e253a472e6c7e"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.122458 4574 generic.go:334] "Generic (PLEG): container finished" podID="f937ae2e-fb2c-4632-b030-e4547cb604bf" containerID="06b333b01bed766728865b88c1ab0187ee11cb7871910ac8cedcce76ba626acc" exitCode=0 Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.122545 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2ssvl" event={"ID":"f937ae2e-fb2c-4632-b030-e4547cb604bf","Type":"ContainerDied","Data":"06b333b01bed766728865b88c1ab0187ee11cb7871910ac8cedcce76ba626acc"} Oct 04 05:03:59 crc kubenswrapper[4574]: I1004 05:03:59.125372 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g5d9n" event={"ID":"65e6f397-f39a-4b3b-af37-05020e371987","Type":"ContainerStarted","Data":"c83590f1f3635aaa026cbb51b67d3858d226eb87c4704bd9a9989838004f41f8"} Oct 04 05:04:00 crc kubenswrapper[4574]: I1004 05:04:00.749636 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4606794e-3d8f-4aa5-80fb-c918ffd40937" path="/var/lib/kubelet/pods/4606794e-3d8f-4aa5-80fb-c918ffd40937/volumes" Oct 04 05:04:02 crc kubenswrapper[4574]: I1004 05:04:02.743333 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-s496q" Oct 04 05:04:02 crc kubenswrapper[4574]: I1004 05:04:02.855665 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2cw2\" (UniqueName: \"kubernetes.io/projected/6440c95c-883f-4d9e-b095-6589637f1059-kube-api-access-b2cw2\") pod \"6440c95c-883f-4d9e-b095-6589637f1059\" (UID: \"6440c95c-883f-4d9e-b095-6589637f1059\") " Oct 04 05:04:02 crc kubenswrapper[4574]: I1004 05:04:02.863667 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6440c95c-883f-4d9e-b095-6589637f1059-kube-api-access-b2cw2" (OuterVolumeSpecName: "kube-api-access-b2cw2") pod "6440c95c-883f-4d9e-b095-6589637f1059" (UID: "6440c95c-883f-4d9e-b095-6589637f1059"). InnerVolumeSpecName "kube-api-access-b2cw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:02 crc kubenswrapper[4574]: I1004 05:04:02.957229 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2cw2\" (UniqueName: \"kubernetes.io/projected/6440c95c-883f-4d9e-b095-6589637f1059-kube-api-access-b2cw2\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:03 crc kubenswrapper[4574]: I1004 05:04:03.159139 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-s496q" event={"ID":"6440c95c-883f-4d9e-b095-6589637f1059","Type":"ContainerDied","Data":"0e0a215b1fa1cdcaf07f7ad1d2a448717b3f47d35c3a2cbffd5e253a472e6c7e"} Oct 04 05:04:03 crc kubenswrapper[4574]: I1004 05:04:03.159697 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0a215b1fa1cdcaf07f7ad1d2a448717b3f47d35c3a2cbffd5e253a472e6c7e" Oct 04 05:04:03 crc kubenswrapper[4574]: I1004 05:04:03.159772 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-s496q" Oct 04 05:04:12 crc kubenswrapper[4574]: E1004 05:04:12.418503 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Oct 04 05:04:12 crc kubenswrapper[4574]: E1004 05:04:12.419087 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q8pm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,
SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-xnbbl_openstack(59854ff7-fdcf-4a21-9fa6-9ab422be068e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:04:12 crc kubenswrapper[4574]: E1004 05:04:12.420274 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-xnbbl" podUID="59854ff7-fdcf-4a21-9fa6-9ab422be068e" Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.554898 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-b96fh" Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.596453 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2ssvl" Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.624897 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhlwt\" (UniqueName: \"kubernetes.io/projected/f80e4ab0-c323-4894-8d2d-23d6e380aeb1-kube-api-access-fhlwt\") pod \"f80e4ab0-c323-4894-8d2d-23d6e380aeb1\" (UID: \"f80e4ab0-c323-4894-8d2d-23d6e380aeb1\") " Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.630538 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80e4ab0-c323-4894-8d2d-23d6e380aeb1-kube-api-access-fhlwt" (OuterVolumeSpecName: "kube-api-access-fhlwt") pod "f80e4ab0-c323-4894-8d2d-23d6e380aeb1" (UID: "f80e4ab0-c323-4894-8d2d-23d6e380aeb1"). InnerVolumeSpecName "kube-api-access-fhlwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.727003 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzkdm\" (UniqueName: \"kubernetes.io/projected/f937ae2e-fb2c-4632-b030-e4547cb604bf-kube-api-access-pzkdm\") pod \"f937ae2e-fb2c-4632-b030-e4547cb604bf\" (UID: \"f937ae2e-fb2c-4632-b030-e4547cb604bf\") " Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.727356 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhlwt\" (UniqueName: \"kubernetes.io/projected/f80e4ab0-c323-4894-8d2d-23d6e380aeb1-kube-api-access-fhlwt\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.730045 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f937ae2e-fb2c-4632-b030-e4547cb604bf-kube-api-access-pzkdm" (OuterVolumeSpecName: "kube-api-access-pzkdm") pod "f937ae2e-fb2c-4632-b030-e4547cb604bf" (UID: "f937ae2e-fb2c-4632-b030-e4547cb604bf"). InnerVolumeSpecName "kube-api-access-pzkdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:12 crc kubenswrapper[4574]: I1004 05:04:12.828924 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzkdm\" (UniqueName: \"kubernetes.io/projected/f937ae2e-fb2c-4632-b030-e4547cb604bf-kube-api-access-pzkdm\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.237348 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b96fh" event={"ID":"f80e4ab0-c323-4894-8d2d-23d6e380aeb1","Type":"ContainerDied","Data":"d4e8afff3d1908cf7135e35193847f20acc467abb9e79a9086927af558dc2ee7"} Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.237704 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e8afff3d1908cf7135e35193847f20acc467abb9e79a9086927af558dc2ee7" Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.237377 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-b96fh" Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.239295 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2ssvl" Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.239305 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2ssvl" event={"ID":"f937ae2e-fb2c-4632-b030-e4547cb604bf","Type":"ContainerDied","Data":"f777cca7511251879d483cfd060366a339100443436c44b3b3b5df53d9f00598"} Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.239369 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f777cca7511251879d483cfd060366a339100443436c44b3b3b5df53d9f00598" Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.241370 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g5d9n" event={"ID":"65e6f397-f39a-4b3b-af37-05020e371987","Type":"ContainerStarted","Data":"13c5bc57cfc1b6560ec5815abafe4f86fb7c5bb8683656f772c837524befe1e5"} Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.248080 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"060102888e6d00e30bc0d07219c74678db04106219134a3ad12e91653a5f02ca"} Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.248414 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"8a19f1f100c8d5f373a551bb24ddc01416ccf9ca525a2786476143b2750c1196"} Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.248641 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"616161bdbbfe9d4ed31bb4e22d9931aaceac6d42075db852a4d1545062be0ff7"} Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.248714 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"a2c4aeb5b485bf68e713c6ee7ea249663ffe9a4f1a127cd26f773d5aace7b0e0"} Oct 04 05:04:13 crc kubenswrapper[4574]: E1004 05:04:13.248851 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-xnbbl" podUID="59854ff7-fdcf-4a21-9fa6-9ab422be068e" Oct 04 05:04:13 crc kubenswrapper[4574]: I1004 05:04:13.301810 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-g5d9n" podStartSLOduration=3.036810661 podStartE2EDuration="17.301794579s" podCreationTimestamp="2025-10-04 05:03:56 +0000 UTC" firstStartedPulling="2025-10-04 05:03:58.160848772 +0000 UTC m=+1064.014991814" lastFinishedPulling="2025-10-04 05:04:12.42583269 +0000 UTC m=+1078.279975732" observedRunningTime="2025-10-04 05:04:13.276816773 +0000 UTC m=+1079.130959815" watchObservedRunningTime="2025-10-04 05:04:13.301794579 +0000 UTC m=+1079.155937621" Oct 04 05:04:15 crc kubenswrapper[4574]: I1004 05:04:15.289364 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"a362648121f38add79803ad0997b257a5ed9d359eb71fe52d6bad61f0c7ec060"} Oct 04 05:04:15 crc kubenswrapper[4574]: I1004 05:04:15.290134 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"50a6e207763e4cad59705ebee75425f823def6749f1682d67bab442217c4eb5e"} Oct 04 05:04:15 crc kubenswrapper[4574]: I1004 05:04:15.290193 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"3f8464d5335d1dc02d10d06d90ba2468024ad68ac3162d8308bc37fd240c8fe9"} Oct 04 05:04:15 crc kubenswrapper[4574]: I1004 05:04:15.290209 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"759bd07f70c7e31d3739e0330e3fa96982bc876f94cdc2abf9bd33b3e901da99"} Oct 04 05:04:15 crc kubenswrapper[4574]: I1004 05:04:15.290223 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"597ebea037e88a04d73782d133980384e87aac34ea7a8d5fd5c0aaee51445cd0"} Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.302510 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"5dc7d98151a1e51744c9bf72c5b438d2d49768332b614979640ffee79415d5d3"} Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.302824 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"74b762df-991e-4e0c-9be6-c3e468408254","Type":"ContainerStarted","Data":"d0ef19bca53796c511115e391e257eafae277279d3f792e4b04e11337cab6459"} Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.338055 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.862161472 podStartE2EDuration="56.338037273s" podCreationTimestamp="2025-10-04 05:03:20 +0000 UTC" firstStartedPulling="2025-10-04 05:03:54.695629277 +0000 UTC m=+1060.549772359" lastFinishedPulling="2025-10-04 05:04:14.171505118 +0000 UTC m=+1080.025648160" observedRunningTime="2025-10-04 05:04:16.330057274 +0000 UTC m=+1082.184200316" watchObservedRunningTime="2025-10-04 05:04:16.338037273 +0000 UTC m=+1082.192180315" Oct 04 05:04:16 crc 
kubenswrapper[4574]: I1004 05:04:16.599136 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dp6q9"] Oct 04 05:04:16 crc kubenswrapper[4574]: E1004 05:04:16.599545 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6440c95c-883f-4d9e-b095-6589637f1059" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.599567 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6440c95c-883f-4d9e-b095-6589637f1059" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: E1004 05:04:16.599592 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4606794e-3d8f-4aa5-80fb-c918ffd40937" containerName="ovn-config" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.599601 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4606794e-3d8f-4aa5-80fb-c918ffd40937" containerName="ovn-config" Oct 04 05:04:16 crc kubenswrapper[4574]: E1004 05:04:16.599616 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e4ab0-c323-4894-8d2d-23d6e380aeb1" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.599625 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e4ab0-c323-4894-8d2d-23d6e380aeb1" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: E1004 05:04:16.599657 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f937ae2e-fb2c-4632-b030-e4547cb604bf" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.599665 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f937ae2e-fb2c-4632-b030-e4547cb604bf" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.599847 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f937ae2e-fb2c-4632-b030-e4547cb604bf" containerName="mariadb-database-create" Oct 04 05:04:16 crc 
kubenswrapper[4574]: I1004 05:04:16.599887 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="4606794e-3d8f-4aa5-80fb-c918ffd40937" containerName="ovn-config" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.599903 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="6440c95c-883f-4d9e-b095-6589637f1059" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.599916 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80e4ab0-c323-4894-8d2d-23d6e380aeb1" containerName="mariadb-database-create" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.600968 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.603424 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.659741 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dp6q9"] Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.699614 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-32a0-account-create-mtnfl"] Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.700714 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-32a0-account-create-mtnfl" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.703170 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.703542 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-config\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.703639 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjltt\" (UniqueName: \"kubernetes.io/projected/2b2782b0-a68d-433c-ab80-d76f353a7539-kube-api-access-rjltt\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.703677 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.703751 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.703816 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.703863 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.712611 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-32a0-account-create-mtnfl"] Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.805706 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.805781 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.805866 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-config\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.805894 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txprd\" (UniqueName: \"kubernetes.io/projected/a91fda1d-56f0-41a6-af12-1bf124f53361-kube-api-access-txprd\") pod \"neutron-32a0-account-create-mtnfl\" (UID: \"a91fda1d-56f0-41a6-af12-1bf124f53361\") " pod="openstack/neutron-32a0-account-create-mtnfl" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.805978 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjltt\" (UniqueName: \"kubernetes.io/projected/2b2782b0-a68d-433c-ab80-d76f353a7539-kube-api-access-rjltt\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.806005 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.806066 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.806691 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: 
\"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.807873 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.808301 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.808744 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-config\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.809097 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.835346 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjltt\" (UniqueName: \"kubernetes.io/projected/2b2782b0-a68d-433c-ab80-d76f353a7539-kube-api-access-rjltt\") pod \"dnsmasq-dns-77585f5f8c-dp6q9\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 
05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.908137 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txprd\" (UniqueName: \"kubernetes.io/projected/a91fda1d-56f0-41a6-af12-1bf124f53361-kube-api-access-txprd\") pod \"neutron-32a0-account-create-mtnfl\" (UID: \"a91fda1d-56f0-41a6-af12-1bf124f53361\") " pod="openstack/neutron-32a0-account-create-mtnfl" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.916686 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:16 crc kubenswrapper[4574]: I1004 05:04:16.928359 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txprd\" (UniqueName: \"kubernetes.io/projected/a91fda1d-56f0-41a6-af12-1bf124f53361-kube-api-access-txprd\") pod \"neutron-32a0-account-create-mtnfl\" (UID: \"a91fda1d-56f0-41a6-af12-1bf124f53361\") " pod="openstack/neutron-32a0-account-create-mtnfl" Oct 04 05:04:17 crc kubenswrapper[4574]: I1004 05:04:17.015398 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-32a0-account-create-mtnfl" Oct 04 05:04:17 crc kubenswrapper[4574]: I1004 05:04:17.312530 4574 generic.go:334] "Generic (PLEG): container finished" podID="65e6f397-f39a-4b3b-af37-05020e371987" containerID="13c5bc57cfc1b6560ec5815abafe4f86fb7c5bb8683656f772c837524befe1e5" exitCode=0 Oct 04 05:04:17 crc kubenswrapper[4574]: I1004 05:04:17.313661 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g5d9n" event={"ID":"65e6f397-f39a-4b3b-af37-05020e371987","Type":"ContainerDied","Data":"13c5bc57cfc1b6560ec5815abafe4f86fb7c5bb8683656f772c837524befe1e5"} Oct 04 05:04:17 crc kubenswrapper[4574]: I1004 05:04:17.392425 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dp6q9"] Oct 04 05:04:17 crc kubenswrapper[4574]: W1004 05:04:17.395402 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b2782b0_a68d_433c_ab80_d76f353a7539.slice/crio-be40340adf858e1aa86bd1317ec0ba1445f35cb700d88ba61e7e1229ce0a144e WatchSource:0}: Error finding container be40340adf858e1aa86bd1317ec0ba1445f35cb700d88ba61e7e1229ce0a144e: Status 404 returned error can't find the container with id be40340adf858e1aa86bd1317ec0ba1445f35cb700d88ba61e7e1229ce0a144e Oct 04 05:04:17 crc kubenswrapper[4574]: I1004 05:04:17.485032 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-32a0-account-create-mtnfl"] Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.320924 4574 generic.go:334] "Generic (PLEG): container finished" podID="a91fda1d-56f0-41a6-af12-1bf124f53361" containerID="427a4532ebf2cdf01c075865843888e50ff79eb65d8671c95d848efd7ede07da" exitCode=0 Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.321005 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-32a0-account-create-mtnfl" 
event={"ID":"a91fda1d-56f0-41a6-af12-1bf124f53361","Type":"ContainerDied","Data":"427a4532ebf2cdf01c075865843888e50ff79eb65d8671c95d848efd7ede07da"} Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.321031 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-32a0-account-create-mtnfl" event={"ID":"a91fda1d-56f0-41a6-af12-1bf124f53361","Type":"ContainerStarted","Data":"6a95abed2e38eb1319f86c352592e6e06f39cc914b5a16d33160f83fa16a1c57"} Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.322795 4574 generic.go:334] "Generic (PLEG): container finished" podID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerID="556fc54546e6129e599002d6f4d7d99356343e6c841867a694792ab6eb979a3d" exitCode=0 Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.322905 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" event={"ID":"2b2782b0-a68d-433c-ab80-d76f353a7539","Type":"ContainerDied","Data":"556fc54546e6129e599002d6f4d7d99356343e6c841867a694792ab6eb979a3d"} Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.322982 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" event={"ID":"2b2782b0-a68d-433c-ab80-d76f353a7539","Type":"ContainerStarted","Data":"be40340adf858e1aa86bd1317ec0ba1445f35cb700d88ba61e7e1229ce0a144e"} Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.660035 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.743474 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-config-data\") pod \"65e6f397-f39a-4b3b-af37-05020e371987\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.743519 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sk2k\" (UniqueName: \"kubernetes.io/projected/65e6f397-f39a-4b3b-af37-05020e371987-kube-api-access-5sk2k\") pod \"65e6f397-f39a-4b3b-af37-05020e371987\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.743634 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-combined-ca-bundle\") pod \"65e6f397-f39a-4b3b-af37-05020e371987\" (UID: \"65e6f397-f39a-4b3b-af37-05020e371987\") " Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.748614 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e6f397-f39a-4b3b-af37-05020e371987-kube-api-access-5sk2k" (OuterVolumeSpecName: "kube-api-access-5sk2k") pod "65e6f397-f39a-4b3b-af37-05020e371987" (UID: "65e6f397-f39a-4b3b-af37-05020e371987"). InnerVolumeSpecName "kube-api-access-5sk2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.775469 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65e6f397-f39a-4b3b-af37-05020e371987" (UID: "65e6f397-f39a-4b3b-af37-05020e371987"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.795568 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-config-data" (OuterVolumeSpecName: "config-data") pod "65e6f397-f39a-4b3b-af37-05020e371987" (UID: "65e6f397-f39a-4b3b-af37-05020e371987"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.845277 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.845312 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sk2k\" (UniqueName: \"kubernetes.io/projected/65e6f397-f39a-4b3b-af37-05020e371987-kube-api-access-5sk2k\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:18 crc kubenswrapper[4574]: I1004 05:04:18.845325 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e6f397-f39a-4b3b-af37-05020e371987-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.333492 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g5d9n" event={"ID":"65e6f397-f39a-4b3b-af37-05020e371987","Type":"ContainerDied","Data":"c83590f1f3635aaa026cbb51b67d3858d226eb87c4704bd9a9989838004f41f8"} Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.333795 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c83590f1f3635aaa026cbb51b67d3858d226eb87c4704bd9a9989838004f41f8" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.333700 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g5d9n" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.336835 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" event={"ID":"2b2782b0-a68d-433c-ab80-d76f353a7539","Type":"ContainerStarted","Data":"dd501a35bccc75a1037fc28b945335738152a16aeda9cf8693c1695017cc3a5f"} Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.371955 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" podStartSLOduration=3.37193532 podStartE2EDuration="3.37193532s" podCreationTimestamp="2025-10-04 05:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:19.364797856 +0000 UTC m=+1085.218940898" watchObservedRunningTime="2025-10-04 05:04:19.37193532 +0000 UTC m=+1085.226078352" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.408138 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.408216 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.558411 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k5m4r"] Oct 04 05:04:19 crc kubenswrapper[4574]: E1004 05:04:19.566364 4574 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65e6f397-f39a-4b3b-af37-05020e371987" containerName="keystone-db-sync" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.566409 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e6f397-f39a-4b3b-af37-05020e371987" containerName="keystone-db-sync" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.566650 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e6f397-f39a-4b3b-af37-05020e371987" containerName="keystone-db-sync" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.567434 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.580309 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.580471 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.580582 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-49xhf" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.580667 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.621752 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dp6q9"] Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.636407 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k5m4r"] Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.662643 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-scripts\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " 
pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.662710 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vck\" (UniqueName: \"kubernetes.io/projected/588695e8-2faf-4c61-bb2c-0caa5257f0eb-kube-api-access-99vck\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.662758 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-config-data\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.662805 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-fernet-keys\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.662851 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-credential-keys\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.662879 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-combined-ca-bundle\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " 
pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.677825 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wxsxg"] Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.679338 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.720396 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wxsxg"] Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770004 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-config-data\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770087 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-fernet-keys\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770131 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-credential-keys\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770170 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-combined-ca-bundle\") pod \"keystone-bootstrap-k5m4r\" (UID: 
\"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770212 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770359 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770407 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-config\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770466 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djlmk\" (UniqueName: \"kubernetes.io/projected/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-kube-api-access-djlmk\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770500 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-svc\") pod 
\"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770530 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770565 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-scripts\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.770605 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99vck\" (UniqueName: \"kubernetes.io/projected/588695e8-2faf-4c61-bb2c-0caa5257f0eb-kube-api-access-99vck\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.779429 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-config-data\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.782316 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-combined-ca-bundle\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " 
pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.782713 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-fernet-keys\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.783079 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-credential-keys\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.787521 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-scripts\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.814109 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vck\" (UniqueName: \"kubernetes.io/projected/588695e8-2faf-4c61-bb2c-0caa5257f0eb-kube-api-access-99vck\") pod \"keystone-bootstrap-k5m4r\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.875495 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.875818 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.875854 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-config\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.875907 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djlmk\" (UniqueName: \"kubernetes.io/projected/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-kube-api-access-djlmk\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.875945 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-svc\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.875979 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.877015 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.878964 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.879541 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.879911 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-config\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.880051 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-svc\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.921865 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:19 crc kubenswrapper[4574]: I1004 05:04:19.955197 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-32a0-account-create-mtnfl" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.076360 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6978f45d5c-g5q99"] Oct 04 05:04:20 crc kubenswrapper[4574]: E1004 05:04:20.076886 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91fda1d-56f0-41a6-af12-1bf124f53361" containerName="mariadb-account-create" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.076903 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91fda1d-56f0-41a6-af12-1bf124f53361" containerName="mariadb-account-create" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.077111 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91fda1d-56f0-41a6-af12-1bf124f53361" containerName="mariadb-account-create" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.079344 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txprd\" (UniqueName: \"kubernetes.io/projected/a91fda1d-56f0-41a6-af12-1bf124f53361-kube-api-access-txprd\") pod \"a91fda1d-56f0-41a6-af12-1bf124f53361\" (UID: \"a91fda1d-56f0-41a6-af12-1bf124f53361\") " Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.084536 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.088484 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91fda1d-56f0-41a6-af12-1bf124f53361-kube-api-access-txprd" (OuterVolumeSpecName: "kube-api-access-txprd") pod "a91fda1d-56f0-41a6-af12-1bf124f53361" (UID: "a91fda1d-56f0-41a6-af12-1bf124f53361"). InnerVolumeSpecName "kube-api-access-txprd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.108715 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.108963 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.109124 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-dlvtr" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.111607 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.173694 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6978f45d5c-g5q99"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.181595 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-config-data\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.181706 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9zs7\" (UniqueName: \"kubernetes.io/projected/de90c2bd-086a-4b9b-846e-048709c26ede-kube-api-access-c9zs7\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.181839 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de90c2bd-086a-4b9b-846e-048709c26ede-logs\") pod \"horizon-6978f45d5c-g5q99\" (UID: 
\"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.181881 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de90c2bd-086a-4b9b-846e-048709c26ede-horizon-secret-key\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.181916 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-scripts\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.181985 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txprd\" (UniqueName: \"kubernetes.io/projected/a91fda1d-56f0-41a6-af12-1bf124f53361-kube-api-access-txprd\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.282949 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djlmk\" (UniqueName: \"kubernetes.io/projected/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-kube-api-access-djlmk\") pod \"dnsmasq-dns-55fff446b9-wxsxg\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.284533 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de90c2bd-086a-4b9b-846e-048709c26ede-logs\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.284610 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de90c2bd-086a-4b9b-846e-048709c26ede-horizon-secret-key\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.284651 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-scripts\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.284680 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-config-data\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.284762 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9zs7\" (UniqueName: \"kubernetes.io/projected/de90c2bd-086a-4b9b-846e-048709c26ede-kube-api-access-c9zs7\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.288042 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-scripts\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.288700 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/de90c2bd-086a-4b9b-846e-048709c26ede-logs\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.289401 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-config-data\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.294738 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de90c2bd-086a-4b9b-846e-048709c26ede-horizon-secret-key\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.309009 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.382047 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-32a0-account-create-mtnfl" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.383440 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-32a0-account-create-mtnfl" event={"ID":"a91fda1d-56f0-41a6-af12-1bf124f53361","Type":"ContainerDied","Data":"6a95abed2e38eb1319f86c352592e6e06f39cc914b5a16d33160f83fa16a1c57"} Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.383490 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a95abed2e38eb1319f86c352592e6e06f39cc914b5a16d33160f83fa16a1c57" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.383520 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.412547 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-kt5kv"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.414000 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.420479 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wxsxg"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.480367 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z4lvd" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.480543 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.480663 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.483331 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9zs7\" (UniqueName: \"kubernetes.io/projected/de90c2bd-086a-4b9b-846e-048709c26ede-kube-api-access-c9zs7\") pod \"horizon-6978f45d5c-g5q99\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.501600 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-config-data\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.501764 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-scripts\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.501904 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-logs\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.501964 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-combined-ca-bundle\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.501997 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29sqb\" (UniqueName: \"kubernetes.io/projected/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-kube-api-access-29sqb\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.519140 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.543037 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kt5kv"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.565904 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57f9bc9c8f-q72p7"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.580867 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613135 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6970e30d-161e-4b7f-bcff-81882edd065f-logs\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613193 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-config-data\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613221 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-combined-ca-bundle\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613256 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29sqb\" (UniqueName: \"kubernetes.io/projected/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-kube-api-access-29sqb\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613289 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-config-data\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 
05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613313 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-scripts\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613350 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-scripts\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613386 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv8w6\" (UniqueName: \"kubernetes.io/projected/6970e30d-161e-4b7f-bcff-81882edd065f-kube-api-access-mv8w6\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613440 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6970e30d-161e-4b7f-bcff-81882edd065f-horizon-secret-key\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613483 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-logs\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.613976 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-logs\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.629342 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-scripts\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.640599 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-config-data\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: E1004 05:04:20.641062 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda91fda1d_56f0_41a6_af12_1bf124f53361.slice\": RecentStats: unable to find data in memory cache]" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.656070 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-combined-ca-bundle\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.658076 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f9bc9c8f-q72p7"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.703886 4574 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-29sqb\" (UniqueName: \"kubernetes.io/projected/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-kube-api-access-29sqb\") pod \"placement-db-sync-kt5kv\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.715517 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6970e30d-161e-4b7f-bcff-81882edd065f-horizon-secret-key\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.715811 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6970e30d-161e-4b7f-bcff-81882edd065f-logs\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.715927 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-config-data\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.716085 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-scripts\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.716271 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv8w6\" (UniqueName: 
\"kubernetes.io/projected/6970e30d-161e-4b7f-bcff-81882edd065f-kube-api-access-mv8w6\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.716651 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6970e30d-161e-4b7f-bcff-81882edd065f-logs\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.717305 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-wfxkf"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.719464 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-scripts\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.719807 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-config-data\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.719939 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.725743 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6970e30d-161e-4b7f-bcff-81882edd065f-horizon-secret-key\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.772739 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv8w6\" (UniqueName: \"kubernetes.io/projected/6970e30d-161e-4b7f-bcff-81882edd065f-kube-api-access-mv8w6\") pod \"horizon-57f9bc9c8f-q72p7\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.777863 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.797022 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.802912 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-wfxkf"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.802947 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.803028 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.808284 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.808526 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.817926 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.818004 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.818028 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.818105 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxnd6\" (UniqueName: \"kubernetes.io/projected/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-kube-api-access-kxnd6\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " 
pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.818122 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.818139 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-config\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.920091 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.920508 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921021 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc 
kubenswrapper[4574]: I1004 05:04:20.921047 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921065 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921119 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-scripts\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921348 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgb6x\" (UniqueName: \"kubernetes.io/projected/f7cb1e07-7587-4b93-bf2f-a8229038b290-kube-api-access-vgb6x\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921366 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxnd6\" (UniqueName: \"kubernetes.io/projected/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-kube-api-access-kxnd6\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921384 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921400 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-config\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921468 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921504 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-config-data\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.921529 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.923223 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.923891 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.924777 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.925409 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-config\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.925527 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.934647 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:20 crc kubenswrapper[4574]: I1004 05:04:20.962056 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxnd6\" (UniqueName: \"kubernetes.io/projected/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-kube-api-access-kxnd6\") pod \"dnsmasq-dns-76fcf4b695-wfxkf\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.022837 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgb6x\" (UniqueName: \"kubernetes.io/projected/f7cb1e07-7587-4b93-bf2f-a8229038b290-kube-api-access-vgb6x\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.022912 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.022982 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-config-data\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.023005 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.023048 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.023070 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.023123 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-scripts\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.024856 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.026319 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.033533 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc 
kubenswrapper[4574]: I1004 05:04:21.038141 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-config-data\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.038204 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-scripts\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.041475 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.063679 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgb6x\" (UniqueName: \"kubernetes.io/projected/f7cb1e07-7587-4b93-bf2f-a8229038b290-kube-api-access-vgb6x\") pod \"ceilometer-0\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " pod="openstack/ceilometer-0" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.077753 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.189924 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k5m4r"] Oct 04 05:04:21 crc kubenswrapper[4574]: I1004 05:04:21.251395 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:21.405431 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5m4r" event={"ID":"588695e8-2faf-4c61-bb2c-0caa5257f0eb","Type":"ContainerStarted","Data":"898a8cf75869c9806b26bf70ed160cb1f71b783c3d31b45c5e4adc541e6622ba"} Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:21.405589 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" podUID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerName="dnsmasq-dns" containerID="cri-o://dd501a35bccc75a1037fc28b945335738152a16aeda9cf8693c1695017cc3a5f" gracePeriod=10 Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:21.469302 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wxsxg"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:21.491141 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6978f45d5c-g5q99"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:21.666884 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kt5kv"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.004767 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8c9r7"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.006343 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.010786 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.011029 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.011341 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7qmf7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.037555 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8c9r7"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.069869 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-config\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.070019 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-combined-ca-bundle\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.070061 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t8qv\" (UniqueName: \"kubernetes.io/projected/ad26bb6b-4342-4bfc-89b0-bb562b16af11-kube-api-access-6t8qv\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.171050 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-combined-ca-bundle\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.171099 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t8qv\" (UniqueName: \"kubernetes.io/projected/ad26bb6b-4342-4bfc-89b0-bb562b16af11-kube-api-access-6t8qv\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.171170 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-config\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.176999 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-config\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.181186 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-combined-ca-bundle\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.189991 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t8qv\" (UniqueName: 
\"kubernetes.io/projected/ad26bb6b-4342-4bfc-89b0-bb562b16af11-kube-api-access-6t8qv\") pod \"neutron-db-sync-8c9r7\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.339719 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.443074 4574 generic.go:334] "Generic (PLEG): container finished" podID="2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" containerID="f3ba1b5f606e3307b09eeb2474f6d8d28f9efcb9bb32bdce485311c61fe85bad" exitCode=0 Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.443135 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" event={"ID":"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0","Type":"ContainerDied","Data":"f3ba1b5f606e3307b09eeb2474f6d8d28f9efcb9bb32bdce485311c61fe85bad"} Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.443162 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" event={"ID":"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0","Type":"ContainerStarted","Data":"fa301c4738d90308158cb2c549bca29c4a25821edbb9a6923014582c864b264f"} Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.460278 4574 generic.go:334] "Generic (PLEG): container finished" podID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerID="dd501a35bccc75a1037fc28b945335738152a16aeda9cf8693c1695017cc3a5f" exitCode=0 Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.460332 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" event={"ID":"2b2782b0-a68d-433c-ab80-d76f353a7539","Type":"ContainerDied","Data":"dd501a35bccc75a1037fc28b945335738152a16aeda9cf8693c1695017cc3a5f"} Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.461693 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5m4r" 
event={"ID":"588695e8-2faf-4c61-bb2c-0caa5257f0eb","Type":"ContainerStarted","Data":"c27c879b5bfc700a69f9da14a6e8697a92671c4ea9d95c59db59f6cb3b3ab5e7"} Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.466718 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978f45d5c-g5q99" event={"ID":"de90c2bd-086a-4b9b-846e-048709c26ede","Type":"ContainerStarted","Data":"886b7ef249256ec7aedaf92b6fd1c5b0540225fd1efe8362900b42e4911cc236"} Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.495510 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kt5kv" event={"ID":"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5","Type":"ContainerStarted","Data":"cffa1a6d1d733ae188494049b46109ce8c87c2e9782ccaab6151690a209103f3"} Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.553660 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k5m4r" podStartSLOduration=3.553639546 podStartE2EDuration="3.553639546s" podCreationTimestamp="2025-10-04 05:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:22.545006828 +0000 UTC m=+1088.399149870" watchObservedRunningTime="2025-10-04 05:04:22.553639546 +0000 UTC m=+1088.407782588" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.636317 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6978f45d5c-g5q99"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.657501 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f8dc79ff-xkm7j"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.659225 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.689791 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f9bc9c8f-q72p7"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.707151 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-wfxkf"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.712783 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f8dc79ff-xkm7j"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.717192 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-scripts\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.717309 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/271a7436-a272-479c-9473-decd7e54d73b-horizon-secret-key\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.717334 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271a7436-a272-479c-9473-decd7e54d73b-logs\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.717391 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-config-data\") pod 
\"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.717424 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvw6z\" (UniqueName: \"kubernetes.io/projected/271a7436-a272-479c-9473-decd7e54d73b-kube-api-access-jvw6z\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: W1004 05:04:22.728431 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod592f8aa2_58a2_4c6d_b0d5_25c688ccf382.slice/crio-74d6959e80b98a1aa80d9d23aa27691f3bacd49e86a39f83b6cf25f86677ff91 WatchSource:0}: Error finding container 74d6959e80b98a1aa80d9d23aa27691f3bacd49e86a39f83b6cf25f86677ff91: Status 404 returned error can't find the container with id 74d6959e80b98a1aa80d9d23aa27691f3bacd49e86a39f83b6cf25f86677ff91 Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.817158 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.818867 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-config-data\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.819604 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw6z\" (UniqueName: \"kubernetes.io/projected/271a7436-a272-479c-9473-decd7e54d73b-kube-api-access-jvw6z\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.820459 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-config-data\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.820520 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-scripts\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.820845 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/271a7436-a272-479c-9473-decd7e54d73b-horizon-secret-key\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 
05:04:22.820871 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271a7436-a272-479c-9473-decd7e54d73b-logs\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.826664 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271a7436-a272-479c-9473-decd7e54d73b-logs\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.832859 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-scripts\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.866945 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw6z\" (UniqueName: \"kubernetes.io/projected/271a7436-a272-479c-9473-decd7e54d73b-kube-api-access-jvw6z\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.869384 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/271a7436-a272-479c-9473-decd7e54d73b-horizon-secret-key\") pod \"horizon-6f8dc79ff-xkm7j\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.922299 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-swift-storage-0\") pod \"2b2782b0-a68d-433c-ab80-d76f353a7539\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.922408 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-nb\") pod \"2b2782b0-a68d-433c-ab80-d76f353a7539\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.922464 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-config\") pod \"2b2782b0-a68d-433c-ab80-d76f353a7539\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.922534 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjltt\" (UniqueName: \"kubernetes.io/projected/2b2782b0-a68d-433c-ab80-d76f353a7539-kube-api-access-rjltt\") pod \"2b2782b0-a68d-433c-ab80-d76f353a7539\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.922566 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-sb\") pod \"2b2782b0-a68d-433c-ab80-d76f353a7539\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.922619 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-svc\") pod \"2b2782b0-a68d-433c-ab80-d76f353a7539\" (UID: \"2b2782b0-a68d-433c-ab80-d76f353a7539\") " Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.962711 
4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:04:22 crc kubenswrapper[4574]: I1004 05:04:22.980118 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2782b0-a68d-433c-ab80-d76f353a7539-kube-api-access-rjltt" (OuterVolumeSpecName: "kube-api-access-rjltt") pod "2b2782b0-a68d-433c-ab80-d76f353a7539" (UID: "2b2782b0-a68d-433c-ab80-d76f353a7539"). InnerVolumeSpecName "kube-api-access-rjltt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.025166 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjltt\" (UniqueName: \"kubernetes.io/projected/2b2782b0-a68d-433c-ab80-d76f353a7539-kube-api-access-rjltt\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.084310 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.119342 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.125084 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b2782b0-a68d-433c-ab80-d76f353a7539" (UID: "2b2782b0-a68d-433c-ab80-d76f353a7539"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.131211 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-config" (OuterVolumeSpecName: "config") pod "2b2782b0-a68d-433c-ab80-d76f353a7539" (UID: "2b2782b0-a68d-433c-ab80-d76f353a7539"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.131805 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b2782b0-a68d-433c-ab80-d76f353a7539" (UID: "2b2782b0-a68d-433c-ab80-d76f353a7539"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.133471 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.133498 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.133539 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.153087 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:23 crc kubenswrapper[4574]: W1004 05:04:23.197639 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7cb1e07_7587_4b93_bf2f_a8229038b290.slice/crio-365ba33ba08a7f9792e3356815ca11b94f454e6062fd4f1704e9fe0502b6f616 WatchSource:0}: Error finding container 365ba33ba08a7f9792e3356815ca11b94f454e6062fd4f1704e9fe0502b6f616: Status 404 returned error can't find the container with id 365ba33ba08a7f9792e3356815ca11b94f454e6062fd4f1704e9fe0502b6f616 Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.219273 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b2782b0-a68d-433c-ab80-d76f353a7539" (UID: "2b2782b0-a68d-433c-ab80-d76f353a7539"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.228071 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b2782b0-a68d-433c-ab80-d76f353a7539" (UID: "2b2782b0-a68d-433c-ab80-d76f353a7539"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.236402 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-svc\") pod \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.236522 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-swift-storage-0\") pod \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.236683 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-config\") pod \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.236749 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-sb\") pod \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.237194 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djlmk\" (UniqueName: \"kubernetes.io/projected/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-kube-api-access-djlmk\") pod \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.237612 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-nb\") pod \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\" (UID: \"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0\") " Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.238629 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.238655 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2782b0-a68d-433c-ab80-d76f353a7539-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.267222 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8c9r7"] Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.282326 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-kube-api-access-djlmk" (OuterVolumeSpecName: "kube-api-access-djlmk") pod "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" (UID: "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0"). InnerVolumeSpecName "kube-api-access-djlmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: W1004 05:04:23.306159 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad26bb6b_4342_4bfc_89b0_bb562b16af11.slice/crio-deb142936eaac676139f54a39f8129dacd27c1869d329db42dcc9de3eb2fe886 WatchSource:0}: Error finding container deb142936eaac676139f54a39f8129dacd27c1869d329db42dcc9de3eb2fe886: Status 404 returned error can't find the container with id deb142936eaac676139f54a39f8129dacd27c1869d329db42dcc9de3eb2fe886 Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.341495 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djlmk\" (UniqueName: \"kubernetes.io/projected/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-kube-api-access-djlmk\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.362996 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" (UID: "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.376393 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" (UID: "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.384866 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-config" (OuterVolumeSpecName: "config") pod "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" (UID: "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.410888 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" (UID: "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.413193 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" (UID: "2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.444666 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.444691 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.444703 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.444712 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.444720 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.510808 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8c9r7" event={"ID":"ad26bb6b-4342-4bfc-89b0-bb562b16af11","Type":"ContainerStarted","Data":"deb142936eaac676139f54a39f8129dacd27c1869d329db42dcc9de3eb2fe886"} Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.516631 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.516746 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-dp6q9" event={"ID":"2b2782b0-a68d-433c-ab80-d76f353a7539","Type":"ContainerDied","Data":"be40340adf858e1aa86bd1317ec0ba1445f35cb700d88ba61e7e1229ce0a144e"} Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.516811 4574 scope.go:117] "RemoveContainer" containerID="dd501a35bccc75a1037fc28b945335738152a16aeda9cf8693c1695017cc3a5f" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.520415 4574 generic.go:334] "Generic (PLEG): container finished" podID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerID="122da771d0cb8ada6112f2727c6c5cb4446022f39db47a10923c65bbc6fa4965" exitCode=0 Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.520470 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" event={"ID":"592f8aa2-58a2-4c6d-b0d5-25c688ccf382","Type":"ContainerDied","Data":"122da771d0cb8ada6112f2727c6c5cb4446022f39db47a10923c65bbc6fa4965"} Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.520534 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" event={"ID":"592f8aa2-58a2-4c6d-b0d5-25c688ccf382","Type":"ContainerStarted","Data":"74d6959e80b98a1aa80d9d23aa27691f3bacd49e86a39f83b6cf25f86677ff91"} Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.525750 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f9bc9c8f-q72p7" event={"ID":"6970e30d-161e-4b7f-bcff-81882edd065f","Type":"ContainerStarted","Data":"46e802ddcde6bc52cfcd123c20e1109784fa16931c281eb77982ca1315aac598"} Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.536709 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" 
event={"ID":"2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0","Type":"ContainerDied","Data":"fa301c4738d90308158cb2c549bca29c4a25821edbb9a6923014582c864b264f"} Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.536810 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wxsxg" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.539602 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerStarted","Data":"365ba33ba08a7f9792e3356815ca11b94f454e6062fd4f1704e9fe0502b6f616"} Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.589194 4574 scope.go:117] "RemoveContainer" containerID="556fc54546e6129e599002d6f4d7d99356343e6c841867a694792ab6eb979a3d" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.621978 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dp6q9"] Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.654247 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-dp6q9"] Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.685879 4574 scope.go:117] "RemoveContainer" containerID="f3ba1b5f606e3307b09eeb2474f6d8d28f9efcb9bb32bdce485311c61fe85bad" Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.740296 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wxsxg"] Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.753024 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wxsxg"] Oct 04 05:04:23 crc kubenswrapper[4574]: I1004 05:04:23.762093 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f8dc79ff-xkm7j"] Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.557657 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8c9r7" 
event={"ID":"ad26bb6b-4342-4bfc-89b0-bb562b16af11","Type":"ContainerStarted","Data":"255ba195d8c5a3f6521995b6b5d51b8d8c42f900cf7009d5dee3896acc9b68fb"} Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.584862 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" event={"ID":"592f8aa2-58a2-4c6d-b0d5-25c688ccf382","Type":"ContainerStarted","Data":"9996ddb2a7c43991e68612cd13d2b5b7095f1eb04d31dbac97dfb32af1bee999"} Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.584904 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.590454 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8c9r7" podStartSLOduration=3.59042873 podStartE2EDuration="3.59042873s" podCreationTimestamp="2025-10-04 05:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:24.585093377 +0000 UTC m=+1090.439236419" watchObservedRunningTime="2025-10-04 05:04:24.59042873 +0000 UTC m=+1090.444571772" Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.598465 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8dc79ff-xkm7j" event={"ID":"271a7436-a272-479c-9473-decd7e54d73b","Type":"ContainerStarted","Data":"13bf89a0d72254c8ba0918da78b1bd4177a6d5a866f0edb9a3089e29a103c088"} Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.608263 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" podStartSLOduration=4.608226921 podStartE2EDuration="4.608226921s" podCreationTimestamp="2025-10-04 05:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:24.603942708 +0000 UTC m=+1090.458085750" 
watchObservedRunningTime="2025-10-04 05:04:24.608226921 +0000 UTC m=+1090.462369963" Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.757877 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" path="/var/lib/kubelet/pods/2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0/volumes" Oct 04 05:04:24 crc kubenswrapper[4574]: I1004 05:04:24.758573 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2782b0-a68d-433c-ab80-d76f353a7539" path="/var/lib/kubelet/pods/2b2782b0-a68d-433c-ab80-d76f353a7539/volumes" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.488748 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d026-account-create-lttsc"] Oct 04 05:04:26 crc kubenswrapper[4574]: E1004 05:04:26.489508 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" containerName="init" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.489523 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" containerName="init" Oct 04 05:04:26 crc kubenswrapper[4574]: E1004 05:04:26.489539 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerName="init" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.489545 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerName="init" Oct 04 05:04:26 crc kubenswrapper[4574]: E1004 05:04:26.489568 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerName="dnsmasq-dns" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.489576 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerName="dnsmasq-dns" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.489744 4574 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2abb8ac0-16d9-49b7-9ea9-4aa36a40c4a0" containerName="init" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.489764 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2782b0-a68d-433c-ab80-d76f353a7539" containerName="dnsmasq-dns" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.490411 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d026-account-create-lttsc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.492688 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.522908 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwlp5\" (UniqueName: \"kubernetes.io/projected/3c8f6604-3234-44f9-8d5f-24c945ccc8ae-kube-api-access-gwlp5\") pod \"barbican-d026-account-create-lttsc\" (UID: \"3c8f6604-3234-44f9-8d5f-24c945ccc8ae\") " pod="openstack/barbican-d026-account-create-lttsc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.523171 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d026-account-create-lttsc"] Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.608915 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-467d-account-create-vc9zc"] Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.610633 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-467d-account-create-vc9zc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.618066 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.627064 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwlp5\" (UniqueName: \"kubernetes.io/projected/3c8f6604-3234-44f9-8d5f-24c945ccc8ae-kube-api-access-gwlp5\") pod \"barbican-d026-account-create-lttsc\" (UID: \"3c8f6604-3234-44f9-8d5f-24c945ccc8ae\") " pod="openstack/barbican-d026-account-create-lttsc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.641706 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-467d-account-create-vc9zc"] Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.676878 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwlp5\" (UniqueName: \"kubernetes.io/projected/3c8f6604-3234-44f9-8d5f-24c945ccc8ae-kube-api-access-gwlp5\") pod \"barbican-d026-account-create-lttsc\" (UID: \"3c8f6604-3234-44f9-8d5f-24c945ccc8ae\") " pod="openstack/barbican-d026-account-create-lttsc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.742901 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zn9g\" (UniqueName: \"kubernetes.io/projected/782fe7ab-749f-434c-ba59-c7ab782dd007-kube-api-access-5zn9g\") pod \"cinder-467d-account-create-vc9zc\" (UID: \"782fe7ab-749f-434c-ba59-c7ab782dd007\") " pod="openstack/cinder-467d-account-create-vc9zc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.816409 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d026-account-create-lttsc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.851816 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zn9g\" (UniqueName: \"kubernetes.io/projected/782fe7ab-749f-434c-ba59-c7ab782dd007-kube-api-access-5zn9g\") pod \"cinder-467d-account-create-vc9zc\" (UID: \"782fe7ab-749f-434c-ba59-c7ab782dd007\") " pod="openstack/cinder-467d-account-create-vc9zc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.891047 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zn9g\" (UniqueName: \"kubernetes.io/projected/782fe7ab-749f-434c-ba59-c7ab782dd007-kube-api-access-5zn9g\") pod \"cinder-467d-account-create-vc9zc\" (UID: \"782fe7ab-749f-434c-ba59-c7ab782dd007\") " pod="openstack/cinder-467d-account-create-vc9zc" Oct 04 05:04:26 crc kubenswrapper[4574]: I1004 05:04:26.942782 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-467d-account-create-vc9zc" Oct 04 05:04:28 crc kubenswrapper[4574]: I1004 05:04:28.682114 4574 generic.go:334] "Generic (PLEG): container finished" podID="588695e8-2faf-4c61-bb2c-0caa5257f0eb" containerID="c27c879b5bfc700a69f9da14a6e8697a92671c4ea9d95c59db59f6cb3b3ab5e7" exitCode=0 Oct 04 05:04:28 crc kubenswrapper[4574]: I1004 05:04:28.682200 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5m4r" event={"ID":"588695e8-2faf-4c61-bb2c-0caa5257f0eb","Type":"ContainerDied","Data":"c27c879b5bfc700a69f9da14a6e8697a92671c4ea9d95c59db59f6cb3b3ab5e7"} Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.327193 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57f9bc9c8f-q72p7"] Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.364149 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57c7ff446b-7tmwn"] Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.369801 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.378037 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.385783 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57c7ff446b-7tmwn"] Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.438633 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f8dc79ff-xkm7j"] Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.464812 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57bfb4d496-nv6hv"] Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.506168 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.512649 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eac9c0-22fc-4c42-93ab-0734f058a121-logs\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.512731 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mbx\" (UniqueName: \"kubernetes.io/projected/56eac9c0-22fc-4c42-93ab-0734f058a121-kube-api-access-r6mbx\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.512781 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-tls-certs\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.512904 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-config-data\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.512999 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-scripts\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") 
" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.513113 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-secret-key\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.513176 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-combined-ca-bundle\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.516115 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57bfb4d496-nv6hv"] Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615119 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vblfp\" (UniqueName: \"kubernetes.io/projected/85281a42-f9ab-4302-9fe9-4e742075530f-kube-api-access-vblfp\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615172 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-config-data\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615211 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-horizon-secret-key\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615272 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85281a42-f9ab-4302-9fe9-4e742075530f-logs\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615302 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-scripts\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615338 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85281a42-f9ab-4302-9fe9-4e742075530f-config-data\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615356 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-secret-key\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615376 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-combined-ca-bundle\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615427 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eac9c0-22fc-4c42-93ab-0734f058a121-logs\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615546 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85281a42-f9ab-4302-9fe9-4e742075530f-scripts\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615605 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mbx\" (UniqueName: \"kubernetes.io/projected/56eac9c0-22fc-4c42-93ab-0734f058a121-kube-api-access-r6mbx\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615650 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-horizon-tls-certs\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615692 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-tls-certs\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615723 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eac9c0-22fc-4c42-93ab-0734f058a121-logs\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.615731 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-combined-ca-bundle\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.616281 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-scripts\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.618407 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-config-data\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.623190 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-secret-key\") pod \"horizon-57c7ff446b-7tmwn\" (UID: 
\"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.623639 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-tls-certs\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.623707 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-combined-ca-bundle\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.639839 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mbx\" (UniqueName: \"kubernetes.io/projected/56eac9c0-22fc-4c42-93ab-0734f058a121-kube-api-access-r6mbx\") pod \"horizon-57c7ff446b-7tmwn\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.709201 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.718460 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-horizon-secret-key\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.718653 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85281a42-f9ab-4302-9fe9-4e742075530f-logs\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.718790 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85281a42-f9ab-4302-9fe9-4e742075530f-config-data\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.718902 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85281a42-f9ab-4302-9fe9-4e742075530f-scripts\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.718974 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-horizon-tls-certs\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 
05:04:29.719070 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-combined-ca-bundle\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.719166 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vblfp\" (UniqueName: \"kubernetes.io/projected/85281a42-f9ab-4302-9fe9-4e742075530f-kube-api-access-vblfp\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.719998 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85281a42-f9ab-4302-9fe9-4e742075530f-scripts\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.720339 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85281a42-f9ab-4302-9fe9-4e742075530f-logs\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.722821 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85281a42-f9ab-4302-9fe9-4e742075530f-config-data\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.723670 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-horizon-tls-certs\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.724518 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-combined-ca-bundle\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.729515 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85281a42-f9ab-4302-9fe9-4e742075530f-horizon-secret-key\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.746732 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vblfp\" (UniqueName: \"kubernetes.io/projected/85281a42-f9ab-4302-9fe9-4e742075530f-kube-api-access-vblfp\") pod \"horizon-57bfb4d496-nv6hv\" (UID: \"85281a42-f9ab-4302-9fe9-4e742075530f\") " pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:29 crc kubenswrapper[4574]: I1004 05:04:29.840945 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:31 crc kubenswrapper[4574]: I1004 05:04:31.079363 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:04:31 crc kubenswrapper[4574]: I1004 05:04:31.146777 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b7wnz"] Oct 04 05:04:31 crc kubenswrapper[4574]: I1004 05:04:31.146997 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-b7wnz" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="dnsmasq-dns" containerID="cri-o://4253713f3e4e44ab75aa65bad2554e74dab2beeae4379a99ce9fd4bb2c0e374c" gracePeriod=10 Oct 04 05:04:31 crc kubenswrapper[4574]: I1004 05:04:31.716960 4574 generic.go:334] "Generic (PLEG): container finished" podID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerID="4253713f3e4e44ab75aa65bad2554e74dab2beeae4379a99ce9fd4bb2c0e374c" exitCode=0 Oct 04 05:04:31 crc kubenswrapper[4574]: I1004 05:04:31.717035 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b7wnz" event={"ID":"618c6c39-89f8-45ee-b9df-753294b5cfeb","Type":"ContainerDied","Data":"4253713f3e4e44ab75aa65bad2554e74dab2beeae4379a99ce9fd4bb2c0e374c"} Oct 04 05:04:32 crc kubenswrapper[4574]: I1004 05:04:32.670315 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-b7wnz" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 04 05:04:34 crc kubenswrapper[4574]: I1004 05:04:34.917545 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.038575 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-combined-ca-bundle\") pod \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.038643 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99vck\" (UniqueName: \"kubernetes.io/projected/588695e8-2faf-4c61-bb2c-0caa5257f0eb-kube-api-access-99vck\") pod \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.038698 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-config-data\") pod \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.040002 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-scripts\") pod \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.040363 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-credential-keys\") pod \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.040452 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-fernet-keys\") pod \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\" (UID: \"588695e8-2faf-4c61-bb2c-0caa5257f0eb\") " Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.046132 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-scripts" (OuterVolumeSpecName: "scripts") pod "588695e8-2faf-4c61-bb2c-0caa5257f0eb" (UID: "588695e8-2faf-4c61-bb2c-0caa5257f0eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.046475 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588695e8-2faf-4c61-bb2c-0caa5257f0eb-kube-api-access-99vck" (OuterVolumeSpecName: "kube-api-access-99vck") pod "588695e8-2faf-4c61-bb2c-0caa5257f0eb" (UID: "588695e8-2faf-4c61-bb2c-0caa5257f0eb"). InnerVolumeSpecName "kube-api-access-99vck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.051969 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "588695e8-2faf-4c61-bb2c-0caa5257f0eb" (UID: "588695e8-2faf-4c61-bb2c-0caa5257f0eb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.052002 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "588695e8-2faf-4c61-bb2c-0caa5257f0eb" (UID: "588695e8-2faf-4c61-bb2c-0caa5257f0eb"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.065750 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "588695e8-2faf-4c61-bb2c-0caa5257f0eb" (UID: "588695e8-2faf-4c61-bb2c-0caa5257f0eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.066993 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-config-data" (OuterVolumeSpecName: "config-data") pod "588695e8-2faf-4c61-bb2c-0caa5257f0eb" (UID: "588695e8-2faf-4c61-bb2c-0caa5257f0eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.142823 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.143062 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99vck\" (UniqueName: \"kubernetes.io/projected/588695e8-2faf-4c61-bb2c-0caa5257f0eb-kube-api-access-99vck\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.143170 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.143270 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 
05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.143353 4574 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.143441 4574 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/588695e8-2faf-4c61-bb2c-0caa5257f0eb-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.776728 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5m4r" event={"ID":"588695e8-2faf-4c61-bb2c-0caa5257f0eb","Type":"ContainerDied","Data":"898a8cf75869c9806b26bf70ed160cb1f71b783c3d31b45c5e4adc541e6622ba"} Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.776779 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898a8cf75869c9806b26bf70ed160cb1f71b783c3d31b45c5e4adc541e6622ba" Oct 04 05:04:35 crc kubenswrapper[4574]: I1004 05:04:35.776804 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k5m4r" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.013911 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k5m4r"] Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.024967 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k5m4r"] Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.098870 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-75f5m"] Oct 04 05:04:36 crc kubenswrapper[4574]: E1004 05:04:36.099344 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588695e8-2faf-4c61-bb2c-0caa5257f0eb" containerName="keystone-bootstrap" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.099362 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="588695e8-2faf-4c61-bb2c-0caa5257f0eb" containerName="keystone-bootstrap" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.099599 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="588695e8-2faf-4c61-bb2c-0caa5257f0eb" containerName="keystone-bootstrap" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.100879 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.107913 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.113563 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-49xhf" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.113855 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.113886 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.121218 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-75f5m"] Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.267287 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjpb\" (UniqueName: \"kubernetes.io/projected/a3aa519a-57c4-46f2-8467-a4b85930eca7-kube-api-access-4xjpb\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.267362 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-scripts\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.267426 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-credential-keys\") pod \"keystone-bootstrap-75f5m\" (UID: 
\"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.267467 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-combined-ca-bundle\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.267562 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-fernet-keys\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.267591 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-config-data\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.369862 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-scripts\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.370342 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-credential-keys\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " 
pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.370509 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-combined-ca-bundle\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.370542 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-fernet-keys\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.370568 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-config-data\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.370656 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjpb\" (UniqueName: \"kubernetes.io/projected/a3aa519a-57c4-46f2-8467-a4b85930eca7-kube-api-access-4xjpb\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.375862 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-config-data\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.375963 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-credential-keys\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.376424 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-combined-ca-bundle\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.376569 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-fernet-keys\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.384425 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-scripts\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.392118 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjpb\" (UniqueName: \"kubernetes.io/projected/a3aa519a-57c4-46f2-8467-a4b85930eca7-kube-api-access-4xjpb\") pod \"keystone-bootstrap-75f5m\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.417817 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:04:36 crc kubenswrapper[4574]: I1004 05:04:36.742301 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588695e8-2faf-4c61-bb2c-0caa5257f0eb" path="/var/lib/kubelet/pods/588695e8-2faf-4c61-bb2c-0caa5257f0eb/volumes" Oct 04 05:04:37 crc kubenswrapper[4574]: E1004 05:04:37.786671 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 04 05:04:37 crc kubenswrapper[4574]: E1004 05:04:37.786873 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b5h5c4hd8h5bdh5d9h97hcdhc8h65bhb7h574h67fhc9h668hcfh56dh58fh67bh64bhd5h548h559hf4h4h659h649hdh57hd8h544h54fhb9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mv8w6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Readiness
Probe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-57f9bc9c8f-q72p7_openstack(6970e30d-161e-4b7f-bcff-81882edd065f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:04:37 crc kubenswrapper[4574]: I1004 05:04:37.944015 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.102548 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb7kd\" (UniqueName: \"kubernetes.io/projected/618c6c39-89f8-45ee-b9df-753294b5cfeb-kube-api-access-kb7kd\") pod \"618c6c39-89f8-45ee-b9df-753294b5cfeb\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.102611 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-sb\") pod \"618c6c39-89f8-45ee-b9df-753294b5cfeb\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.102646 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-nb\") pod 
\"618c6c39-89f8-45ee-b9df-753294b5cfeb\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.102696 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-dns-svc\") pod \"618c6c39-89f8-45ee-b9df-753294b5cfeb\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.102742 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-config\") pod \"618c6c39-89f8-45ee-b9df-753294b5cfeb\" (UID: \"618c6c39-89f8-45ee-b9df-753294b5cfeb\") " Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.116404 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618c6c39-89f8-45ee-b9df-753294b5cfeb-kube-api-access-kb7kd" (OuterVolumeSpecName: "kube-api-access-kb7kd") pod "618c6c39-89f8-45ee-b9df-753294b5cfeb" (UID: "618c6c39-89f8-45ee-b9df-753294b5cfeb"). InnerVolumeSpecName "kube-api-access-kb7kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.156790 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "618c6c39-89f8-45ee-b9df-753294b5cfeb" (UID: "618c6c39-89f8-45ee-b9df-753294b5cfeb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.160875 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "618c6c39-89f8-45ee-b9df-753294b5cfeb" (UID: "618c6c39-89f8-45ee-b9df-753294b5cfeb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.161918 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-config" (OuterVolumeSpecName: "config") pod "618c6c39-89f8-45ee-b9df-753294b5cfeb" (UID: "618c6c39-89f8-45ee-b9df-753294b5cfeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.166346 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "618c6c39-89f8-45ee-b9df-753294b5cfeb" (UID: "618c6c39-89f8-45ee-b9df-753294b5cfeb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.205152 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb7kd\" (UniqueName: \"kubernetes.io/projected/618c6c39-89f8-45ee-b9df-753294b5cfeb-kube-api-access-kb7kd\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.205228 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.205255 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.205267 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.205277 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/618c6c39-89f8-45ee-b9df-753294b5cfeb-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.340297 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d026-account-create-lttsc"] Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.805118 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-b7wnz" event={"ID":"618c6c39-89f8-45ee-b9df-753294b5cfeb","Type":"ContainerDied","Data":"cdb2ac831153cbcaf370d6bee5406a4efbfdb6d5657a7b4f067a3204ad2cc76f"} Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.805193 4574 scope.go:117] "RemoveContainer" 
containerID="4253713f3e4e44ab75aa65bad2554e74dab2beeae4379a99ce9fd4bb2c0e374c" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.806061 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-b7wnz" Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.836032 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b7wnz"] Oct 04 05:04:38 crc kubenswrapper[4574]: I1004 05:04:38.844770 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-b7wnz"] Oct 04 05:04:39 crc kubenswrapper[4574]: E1004 05:04:39.063154 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 04 05:04:39 crc kubenswrapper[4574]: E1004 05:04:39.063373 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n555h94h5bfh88h5f4h676hbdhb7h64ch79h686h59fh58h55dh8hfbh59ch55ch5d7hd6h565h56chd6h654h67ch68fh649h79h677h674h589hbbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vgb6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(f7cb1e07-7587-4b93-bf2f-a8229038b290): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:04:39 crc kubenswrapper[4574]: W1004 05:04:39.684439 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c8f6604_3234_44f9_8d5f_24c945ccc8ae.slice/crio-51b05374ae2e1d2a7d3b29a7e8aca65e3cb644dce509365ab999bfe2a077a835 WatchSource:0}: Error finding container 51b05374ae2e1d2a7d3b29a7e8aca65e3cb644dce509365ab999bfe2a077a835: Status 404 returned error can't find the container with id 51b05374ae2e1d2a7d3b29a7e8aca65e3cb644dce509365ab999bfe2a077a835 Oct 04 05:04:39 crc kubenswrapper[4574]: I1004 05:04:39.713019 4574 scope.go:117] "RemoveContainer" containerID="57ce78c2011e0774f4d945679ad0d85dfac8585f3033f9d235c9ac5112a3c7d8" Oct 04 05:04:39 crc kubenswrapper[4574]: I1004 05:04:39.823803 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d026-account-create-lttsc" 
event={"ID":"3c8f6604-3234-44f9-8d5f-24c945ccc8ae","Type":"ContainerStarted","Data":"51b05374ae2e1d2a7d3b29a7e8aca65e3cb644dce509365ab999bfe2a077a835"} Oct 04 05:04:40 crc kubenswrapper[4574]: E1004 05:04:40.235733 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-57f9bc9c8f-q72p7" podUID="6970e30d-161e-4b7f-bcff-81882edd065f" Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.306217 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57bfb4d496-nv6hv"] Oct 04 05:04:40 crc kubenswrapper[4574]: W1004 05:04:40.341721 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85281a42_f9ab_4302_9fe9_4e742075530f.slice/crio-d1a4f5d71f53bcf8cfcd82ef22d532f590dd9205681d25398f9ceefb1911abfd WatchSource:0}: Error finding container d1a4f5d71f53bcf8cfcd82ef22d532f590dd9205681d25398f9ceefb1911abfd: Status 404 returned error can't find the container with id d1a4f5d71f53bcf8cfcd82ef22d532f590dd9205681d25398f9ceefb1911abfd Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.358743 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-75f5m"] Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.374541 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57c7ff446b-7tmwn"] Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.559829 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-467d-account-create-vc9zc"] Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.756501 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" path="/var/lib/kubelet/pods/618c6c39-89f8-45ee-b9df-753294b5cfeb/volumes" Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 
05:04:40.847561 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerStarted","Data":"98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.848944 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerStarted","Data":"ded04dd5820ce830aca3e3f2c4b13ba72639bd915330f9ef579a98c2f911b1ef"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.857877 4574 generic.go:334] "Generic (PLEG): container finished" podID="3c8f6604-3234-44f9-8d5f-24c945ccc8ae" containerID="462246a6f86b0e742a7f51ccc85daebd966441724a641eb8b101f460d59088ee" exitCode=0 Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.858488 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d026-account-create-lttsc" event={"ID":"3c8f6604-3234-44f9-8d5f-24c945ccc8ae","Type":"ContainerDied","Data":"462246a6f86b0e742a7f51ccc85daebd966441724a641eb8b101f460d59088ee"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.879824 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-467d-account-create-vc9zc" event={"ID":"782fe7ab-749f-434c-ba59-c7ab782dd007","Type":"ContainerStarted","Data":"92b1d0d275d5b7dc46e23090df66b86fad56051b33fafab1cce0e00156e0479b"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.888101 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8dc79ff-xkm7j" event={"ID":"271a7436-a272-479c-9473-decd7e54d73b","Type":"ContainerStarted","Data":"6c8e3e895c71ad05f17660149cd8100595fb33c8905bedd64db036d5bb7f5275"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.888284 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f8dc79ff-xkm7j" 
podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon-log" containerID="cri-o://6c8e3e895c71ad05f17660149cd8100595fb33c8905bedd64db036d5bb7f5275" gracePeriod=30 Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.888949 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f8dc79ff-xkm7j" podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon" containerID="cri-o://73df732f00cceeaa968b8349b02d63760fc46c1d13b94bffe3669b6ebb64e0eb" gracePeriod=30 Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.911154 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-75f5m" event={"ID":"a3aa519a-57c4-46f2-8467-a4b85930eca7","Type":"ContainerStarted","Data":"f7a619fa4e10a5aabdb57f766031e7fb3f2efe761fc03332f7708dc79aacbc5b"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.911493 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-75f5m" event={"ID":"a3aa519a-57c4-46f2-8467-a4b85930eca7","Type":"ContainerStarted","Data":"ad4a879a626179cc0f8ae86c3c5c8994d3812a41b51d3440f0b419fbb0a14130"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.918379 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bfb4d496-nv6hv" event={"ID":"85281a42-f9ab-4302-9fe9-4e742075530f","Type":"ContainerStarted","Data":"396e36ff745582a91328cd300eec656c3488a804e916a1ba203af21e483ddc03"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.918423 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bfb4d496-nv6hv" event={"ID":"85281a42-f9ab-4302-9fe9-4e742075530f","Type":"ContainerStarted","Data":"d1a4f5d71f53bcf8cfcd82ef22d532f590dd9205681d25398f9ceefb1911abfd"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.930969 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f8dc79ff-xkm7j" podStartSLOduration=2.816672521 
podStartE2EDuration="18.930950223s" podCreationTimestamp="2025-10-04 05:04:22 +0000 UTC" firstStartedPulling="2025-10-04 05:04:23.767675848 +0000 UTC m=+1089.621818890" lastFinishedPulling="2025-10-04 05:04:39.88195355 +0000 UTC m=+1105.736096592" observedRunningTime="2025-10-04 05:04:40.92067636 +0000 UTC m=+1106.774819402" watchObservedRunningTime="2025-10-04 05:04:40.930950223 +0000 UTC m=+1106.785093275" Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.932032 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f9bc9c8f-q72p7" event={"ID":"6970e30d-161e-4b7f-bcff-81882edd065f","Type":"ContainerStarted","Data":"888cc9411958952c99824425ce6f7353fcf0b70c05beb7b64497d3c5a594891e"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.932194 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57f9bc9c8f-q72p7" podUID="6970e30d-161e-4b7f-bcff-81882edd065f" containerName="horizon" containerID="cri-o://888cc9411958952c99824425ce6f7353fcf0b70c05beb7b64497d3c5a594891e" gracePeriod=30 Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.936501 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.943845 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978f45d5c-g5q99" event={"ID":"de90c2bd-086a-4b9b-846e-048709c26ede","Type":"ContainerStarted","Data":"cf2e9b1aad4204f63631117978bcfe9485f1c15ac9c59c483c242881f4b0267d"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.943921 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978f45d5c-g5q99" event={"ID":"de90c2bd-086a-4b9b-846e-048709c26ede","Type":"ContainerStarted","Data":"6026ea67f9a1d6ffec9c77e732f9e627e7e2130e17ce502fb73629de3fbe4edf"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.944342 4574 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/horizon-6978f45d5c-g5q99" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon-log" containerID="cri-o://6026ea67f9a1d6ffec9c77e732f9e627e7e2130e17ce502fb73629de3fbe4edf" gracePeriod=30 Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.944916 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6978f45d5c-g5q99" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon" containerID="cri-o://cf2e9b1aad4204f63631117978bcfe9485f1c15ac9c59c483c242881f4b0267d" gracePeriod=30 Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.962313 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kt5kv" event={"ID":"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5","Type":"ContainerStarted","Data":"76b53efba941564fb7f377014e082a0e9e5fd3ce29a21fe59b6f149356f15e1c"} Oct 04 05:04:40 crc kubenswrapper[4574]: I1004 05:04:40.992144 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-75f5m" podStartSLOduration=4.992125114 podStartE2EDuration="4.992125114s" podCreationTimestamp="2025-10-04 05:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:40.963006275 +0000 UTC m=+1106.817149327" watchObservedRunningTime="2025-10-04 05:04:40.992125114 +0000 UTC m=+1106.846268156" Oct 04 05:04:41 crc kubenswrapper[4574]: I1004 05:04:41.021562 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6978f45d5c-g5q99" podStartSLOduration=3.856682779 podStartE2EDuration="22.021540622s" podCreationTimestamp="2025-10-04 05:04:19 +0000 UTC" firstStartedPulling="2025-10-04 05:04:21.520573192 +0000 UTC m=+1087.374716234" lastFinishedPulling="2025-10-04 05:04:39.685431035 +0000 UTC m=+1105.539574077" observedRunningTime="2025-10-04 05:04:40.992414212 +0000 UTC m=+1106.846557264" 
watchObservedRunningTime="2025-10-04 05:04:41.021540622 +0000 UTC m=+1106.875683664" Oct 04 05:04:41 crc kubenswrapper[4574]: I1004 05:04:41.052988 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-kt5kv" podStartSLOduration=3.057525001 podStartE2EDuration="21.052967176s" podCreationTimestamp="2025-10-04 05:04:20 +0000 UTC" firstStartedPulling="2025-10-04 05:04:21.689426124 +0000 UTC m=+1087.543569166" lastFinishedPulling="2025-10-04 05:04:39.684868299 +0000 UTC m=+1105.539011341" observedRunningTime="2025-10-04 05:04:41.039291127 +0000 UTC m=+1106.893434169" watchObservedRunningTime="2025-10-04 05:04:41.052967176 +0000 UTC m=+1106.907110218" Oct 04 05:04:41 crc kubenswrapper[4574]: I1004 05:04:41.974037 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnbbl" event={"ID":"59854ff7-fdcf-4a21-9fa6-9ab422be068e","Type":"ContainerStarted","Data":"18bd4c80932bd536c1168d16197471fcf990efe13d58a500aafc5596f21a6691"} Oct 04 05:04:41 crc kubenswrapper[4574]: I1004 05:04:41.987191 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerStarted","Data":"3e4dc4fc365b9ba947066873c9c3d152cb971dadf939b36d9d774912264c3816"} Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.015896 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bfb4d496-nv6hv" event={"ID":"85281a42-f9ab-4302-9fe9-4e742075530f","Type":"ContainerStarted","Data":"bafa808bdf2a35dee0e61ef90e7b8b4999e39f07d3cde96c5386527343a5b987"} Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.026376 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xnbbl" podStartSLOduration=4.181438617 podStartE2EDuration="49.026345898s" podCreationTimestamp="2025-10-04 05:03:53 +0000 UTC" firstStartedPulling="2025-10-04 05:03:54.901292144 +0000 UTC 
m=+1060.755435186" lastFinishedPulling="2025-10-04 05:04:39.746199425 +0000 UTC m=+1105.600342467" observedRunningTime="2025-10-04 05:04:42.011636739 +0000 UTC m=+1107.865779781" watchObservedRunningTime="2025-10-04 05:04:42.026345898 +0000 UTC m=+1107.880488940" Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.027415 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-467d-account-create-vc9zc" event={"ID":"782fe7ab-749f-434c-ba59-c7ab782dd007","Type":"ContainerStarted","Data":"efdff84fbe8dce694f3912b9dbc26ef881dfefe4a9ad96f15344d202e244ccc2"} Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.034024 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8dc79ff-xkm7j" event={"ID":"271a7436-a272-479c-9473-decd7e54d73b","Type":"ContainerStarted","Data":"73df732f00cceeaa968b8349b02d63760fc46c1d13b94bffe3669b6ebb64e0eb"} Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.064310 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57c7ff446b-7tmwn" podStartSLOduration=13.064289078 podStartE2EDuration="13.064289078s" podCreationTimestamp="2025-10-04 05:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:42.053271014 +0000 UTC m=+1107.907414056" watchObservedRunningTime="2025-10-04 05:04:42.064289078 +0000 UTC m=+1107.918432120" Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.111481 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57bfb4d496-nv6hv" podStartSLOduration=13.11145245 podStartE2EDuration="13.11145245s" podCreationTimestamp="2025-10-04 05:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:42.084664328 +0000 UTC m=+1107.938807370" watchObservedRunningTime="2025-10-04 05:04:42.11145245 
+0000 UTC m=+1107.965595512" Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.115866 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-467d-account-create-vc9zc" podStartSLOduration=16.115853476 podStartE2EDuration="16.115853476s" podCreationTimestamp="2025-10-04 05:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:04:42.104626596 +0000 UTC m=+1107.958769638" watchObservedRunningTime="2025-10-04 05:04:42.115853476 +0000 UTC m=+1107.969996528" Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.374942 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d026-account-create-lttsc" Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.427414 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwlp5\" (UniqueName: \"kubernetes.io/projected/3c8f6604-3234-44f9-8d5f-24c945ccc8ae-kube-api-access-gwlp5\") pod \"3c8f6604-3234-44f9-8d5f-24c945ccc8ae\" (UID: \"3c8f6604-3234-44f9-8d5f-24c945ccc8ae\") " Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.445724 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8f6604-3234-44f9-8d5f-24c945ccc8ae-kube-api-access-gwlp5" (OuterVolumeSpecName: "kube-api-access-gwlp5") pod "3c8f6604-3234-44f9-8d5f-24c945ccc8ae" (UID: "3c8f6604-3234-44f9-8d5f-24c945ccc8ae"). InnerVolumeSpecName "kube-api-access-gwlp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.530194 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwlp5\" (UniqueName: \"kubernetes.io/projected/3c8f6604-3234-44f9-8d5f-24c945ccc8ae-kube-api-access-gwlp5\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:42 crc kubenswrapper[4574]: I1004 05:04:42.671179 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-b7wnz" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 04 05:04:43 crc kubenswrapper[4574]: I1004 05:04:43.044319 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d026-account-create-lttsc" event={"ID":"3c8f6604-3234-44f9-8d5f-24c945ccc8ae","Type":"ContainerDied","Data":"51b05374ae2e1d2a7d3b29a7e8aca65e3cb644dce509365ab999bfe2a077a835"} Oct 04 05:04:43 crc kubenswrapper[4574]: I1004 05:04:43.044636 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b05374ae2e1d2a7d3b29a7e8aca65e3cb644dce509365ab999bfe2a077a835" Oct 04 05:04:43 crc kubenswrapper[4574]: I1004 05:04:43.044409 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d026-account-create-lttsc" Oct 04 05:04:43 crc kubenswrapper[4574]: I1004 05:04:43.046183 4574 generic.go:334] "Generic (PLEG): container finished" podID="782fe7ab-749f-434c-ba59-c7ab782dd007" containerID="efdff84fbe8dce694f3912b9dbc26ef881dfefe4a9ad96f15344d202e244ccc2" exitCode=0 Oct 04 05:04:43 crc kubenswrapper[4574]: I1004 05:04:43.046280 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-467d-account-create-vc9zc" event={"ID":"782fe7ab-749f-434c-ba59-c7ab782dd007","Type":"ContainerDied","Data":"efdff84fbe8dce694f3912b9dbc26ef881dfefe4a9ad96f15344d202e244ccc2"} Oct 04 05:04:43 crc kubenswrapper[4574]: I1004 05:04:43.050809 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerStarted","Data":"75815cc997bbe7a9c11dd0a85638a1fb8145706b94cadf0a9ee36554be1f1b29"} Oct 04 05:04:43 crc kubenswrapper[4574]: I1004 05:04:43.085517 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:04:44 crc kubenswrapper[4574]: I1004 05:04:44.365808 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-467d-account-create-vc9zc" Oct 04 05:04:44 crc kubenswrapper[4574]: I1004 05:04:44.460056 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zn9g\" (UniqueName: \"kubernetes.io/projected/782fe7ab-749f-434c-ba59-c7ab782dd007-kube-api-access-5zn9g\") pod \"782fe7ab-749f-434c-ba59-c7ab782dd007\" (UID: \"782fe7ab-749f-434c-ba59-c7ab782dd007\") " Oct 04 05:04:44 crc kubenswrapper[4574]: I1004 05:04:44.467829 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782fe7ab-749f-434c-ba59-c7ab782dd007-kube-api-access-5zn9g" (OuterVolumeSpecName: "kube-api-access-5zn9g") pod "782fe7ab-749f-434c-ba59-c7ab782dd007" (UID: "782fe7ab-749f-434c-ba59-c7ab782dd007"). InnerVolumeSpecName "kube-api-access-5zn9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:04:44 crc kubenswrapper[4574]: I1004 05:04:44.564667 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zn9g\" (UniqueName: \"kubernetes.io/projected/782fe7ab-749f-434c-ba59-c7ab782dd007-kube-api-access-5zn9g\") on node \"crc\" DevicePath \"\"" Oct 04 05:04:45 crc kubenswrapper[4574]: I1004 05:04:45.068477 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-467d-account-create-vc9zc" event={"ID":"782fe7ab-749f-434c-ba59-c7ab782dd007","Type":"ContainerDied","Data":"92b1d0d275d5b7dc46e23090df66b86fad56051b33fafab1cce0e00156e0479b"} Oct 04 05:04:45 crc kubenswrapper[4574]: I1004 05:04:45.068691 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92b1d0d275d5b7dc46e23090df66b86fad56051b33fafab1cce0e00156e0479b" Oct 04 05:04:45 crc kubenswrapper[4574]: I1004 05:04:45.068529 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-467d-account-create-vc9zc" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.900355 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-khrbr"] Oct 04 05:04:46 crc kubenswrapper[4574]: E1004 05:04:46.901113 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8f6604-3234-44f9-8d5f-24c945ccc8ae" containerName="mariadb-account-create" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.901130 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8f6604-3234-44f9-8d5f-24c945ccc8ae" containerName="mariadb-account-create" Oct 04 05:04:46 crc kubenswrapper[4574]: E1004 05:04:46.901148 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782fe7ab-749f-434c-ba59-c7ab782dd007" containerName="mariadb-account-create" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.901156 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="782fe7ab-749f-434c-ba59-c7ab782dd007" containerName="mariadb-account-create" Oct 04 05:04:46 crc kubenswrapper[4574]: E1004 05:04:46.901181 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="dnsmasq-dns" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.901190 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="dnsmasq-dns" Oct 04 05:04:46 crc kubenswrapper[4574]: E1004 05:04:46.901207 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="init" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.901214 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="init" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.902912 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8f6604-3234-44f9-8d5f-24c945ccc8ae" 
containerName="mariadb-account-create" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.902939 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="782fe7ab-749f-434c-ba59-c7ab782dd007" containerName="mariadb-account-create" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.902961 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="618c6c39-89f8-45ee-b9df-753294b5cfeb" containerName="dnsmasq-dns" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.903631 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.907693 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zbcn2" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.907998 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.911001 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.916535 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-scripts\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.916602 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-db-sync-config-data\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.916633 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-combined-ca-bundle\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.916703 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxlrl\" (UniqueName: \"kubernetes.io/projected/9bd3ebd3-498c-4070-9de7-eab9d2866108-kube-api-access-lxlrl\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.916748 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-config-data\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.916824 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bd3ebd3-498c-4070-9de7-eab9d2866108-etc-machine-id\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:46 crc kubenswrapper[4574]: I1004 05:04:46.919846 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-khrbr"] Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.018453 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-scripts\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 
04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.018498 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-db-sync-config-data\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.019044 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-combined-ca-bundle\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.019129 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxlrl\" (UniqueName: \"kubernetes.io/projected/9bd3ebd3-498c-4070-9de7-eab9d2866108-kube-api-access-lxlrl\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.019182 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-config-data\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.019243 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bd3ebd3-498c-4070-9de7-eab9d2866108-etc-machine-id\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.019313 4574 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bd3ebd3-498c-4070-9de7-eab9d2866108-etc-machine-id\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.025074 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-scripts\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.029342 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-db-sync-config-data\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.029869 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-combined-ca-bundle\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.030438 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-config-data\") pod \"cinder-db-sync-khrbr\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.041881 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxlrl\" (UniqueName: \"kubernetes.io/projected/9bd3ebd3-498c-4070-9de7-eab9d2866108-kube-api-access-lxlrl\") pod \"cinder-db-sync-khrbr\" (UID: 
\"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.119035 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pms9r"] Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.120443 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.124281 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.131312 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rmg8w" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.134263 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pms9r"] Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.226765 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-combined-ca-bundle\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.226818 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-db-sync-config-data\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.226919 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlsc8\" (UniqueName: 
\"kubernetes.io/projected/097cde22-53c8-44ef-90c9-7e7dd7c43609-kube-api-access-tlsc8\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.252302 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-khrbr" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.328844 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-combined-ca-bundle\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.328906 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-db-sync-config-data\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.328955 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlsc8\" (UniqueName: \"kubernetes.io/projected/097cde22-53c8-44ef-90c9-7e7dd7c43609-kube-api-access-tlsc8\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.338186 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-combined-ca-bundle\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.339842 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-db-sync-config-data\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.350706 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlsc8\" (UniqueName: \"kubernetes.io/projected/097cde22-53c8-44ef-90c9-7e7dd7c43609-kube-api-access-tlsc8\") pod \"barbican-db-sync-pms9r\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.452581 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pms9r" Oct 04 05:04:47 crc kubenswrapper[4574]: I1004 05:04:47.821924 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-khrbr"] Oct 04 05:04:47 crc kubenswrapper[4574]: W1004 05:04:47.833575 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bd3ebd3_498c_4070_9de7_eab9d2866108.slice/crio-dfea24e32f2d2c9b15bbc9ff2082cc64c771952182e8954a5c18d683e49f62fd WatchSource:0}: Error finding container dfea24e32f2d2c9b15bbc9ff2082cc64c771952182e8954a5c18d683e49f62fd: Status 404 returned error can't find the container with id dfea24e32f2d2c9b15bbc9ff2082cc64c771952182e8954a5c18d683e49f62fd Oct 04 05:04:48 crc kubenswrapper[4574]: I1004 05:04:48.018470 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pms9r"] Oct 04 05:04:48 crc kubenswrapper[4574]: I1004 05:04:48.106941 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-khrbr" 
event={"ID":"9bd3ebd3-498c-4070-9de7-eab9d2866108","Type":"ContainerStarted","Data":"dfea24e32f2d2c9b15bbc9ff2082cc64c771952182e8954a5c18d683e49f62fd"} Oct 04 05:04:48 crc kubenswrapper[4574]: I1004 05:04:48.108723 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pms9r" event={"ID":"097cde22-53c8-44ef-90c9-7e7dd7c43609","Type":"ContainerStarted","Data":"7050a9f24d331ea45ce562aa819b77af260a849aa1ac8ec6260cfcb4d3f7db88"} Oct 04 05:04:49 crc kubenswrapper[4574]: I1004 05:04:49.404535 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:04:49 crc kubenswrapper[4574]: I1004 05:04:49.405016 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:04:49 crc kubenswrapper[4574]: I1004 05:04:49.709978 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:49 crc kubenswrapper[4574]: I1004 05:04:49.710050 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:04:49 crc kubenswrapper[4574]: I1004 05:04:49.841029 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:49 crc kubenswrapper[4574]: I1004 05:04:49.842032 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:04:50 crc kubenswrapper[4574]: I1004 05:04:50.520917 4574 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:04:51 crc kubenswrapper[4574]: I1004 05:04:51.148517 4574 generic.go:334] "Generic (PLEG): container finished" podID="9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" containerID="76b53efba941564fb7f377014e082a0e9e5fd3ce29a21fe59b6f149356f15e1c" exitCode=0 Oct 04 05:04:51 crc kubenswrapper[4574]: I1004 05:04:51.148567 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kt5kv" event={"ID":"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5","Type":"ContainerDied","Data":"76b53efba941564fb7f377014e082a0e9e5fd3ce29a21fe59b6f149356f15e1c"} Oct 04 05:04:52 crc kubenswrapper[4574]: I1004 05:04:52.182154 4574 generic.go:334] "Generic (PLEG): container finished" podID="a3aa519a-57c4-46f2-8467-a4b85930eca7" containerID="f7a619fa4e10a5aabdb57f766031e7fb3f2efe761fc03332f7708dc79aacbc5b" exitCode=0 Oct 04 05:04:52 crc kubenswrapper[4574]: I1004 05:04:52.184101 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-75f5m" event={"ID":"a3aa519a-57c4-46f2-8467-a4b85930eca7","Type":"ContainerDied","Data":"f7a619fa4e10a5aabdb57f766031e7fb3f2efe761fc03332f7708dc79aacbc5b"} Oct 04 05:04:59 crc kubenswrapper[4574]: I1004 05:04:59.712212 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:04:59 crc kubenswrapper[4574]: I1004 05:04:59.842602 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 
04 05:04:59 crc kubenswrapper[4574]: I1004 05:04:59.933405 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kt5kv" Oct 04 05:04:59 crc kubenswrapper[4574]: I1004 05:04:59.940006 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089105 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-logs\") pod \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089191 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-combined-ca-bundle\") pod \"a3aa519a-57c4-46f2-8467-a4b85930eca7\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089265 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-config-data\") pod \"a3aa519a-57c4-46f2-8467-a4b85930eca7\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089306 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-fernet-keys\") pod \"a3aa519a-57c4-46f2-8467-a4b85930eca7\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089342 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-scripts\") pod 
\"a3aa519a-57c4-46f2-8467-a4b85930eca7\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089370 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-scripts\") pod \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089409 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29sqb\" (UniqueName: \"kubernetes.io/projected/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-kube-api-access-29sqb\") pod \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089451 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-config-data\") pod \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089477 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-credential-keys\") pod \"a3aa519a-57c4-46f2-8467-a4b85930eca7\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089504 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjpb\" (UniqueName: \"kubernetes.io/projected/a3aa519a-57c4-46f2-8467-a4b85930eca7-kube-api-access-4xjpb\") pod \"a3aa519a-57c4-46f2-8467-a4b85930eca7\" (UID: \"a3aa519a-57c4-46f2-8467-a4b85930eca7\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.089531 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-combined-ca-bundle\") pod \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\" (UID: \"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5\") " Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.100518 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-logs" (OuterVolumeSpecName: "logs") pod "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" (UID: "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.116404 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-scripts" (OuterVolumeSpecName: "scripts") pod "a3aa519a-57c4-46f2-8467-a4b85930eca7" (UID: "a3aa519a-57c4-46f2-8467-a4b85930eca7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.117673 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-scripts" (OuterVolumeSpecName: "scripts") pod "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" (UID: "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.120503 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a3aa519a-57c4-46f2-8467-a4b85930eca7" (UID: "a3aa519a-57c4-46f2-8467-a4b85930eca7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.128437 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-kube-api-access-29sqb" (OuterVolumeSpecName: "kube-api-access-29sqb") pod "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" (UID: "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5"). InnerVolumeSpecName "kube-api-access-29sqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.139462 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a3aa519a-57c4-46f2-8467-a4b85930eca7" (UID: "a3aa519a-57c4-46f2-8467-a4b85930eca7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.139546 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3aa519a-57c4-46f2-8467-a4b85930eca7-kube-api-access-4xjpb" (OuterVolumeSpecName: "kube-api-access-4xjpb") pod "a3aa519a-57c4-46f2-8467-a4b85930eca7" (UID: "a3aa519a-57c4-46f2-8467-a4b85930eca7"). InnerVolumeSpecName "kube-api-access-4xjpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.185574 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-config-data" (OuterVolumeSpecName: "config-data") pod "a3aa519a-57c4-46f2-8467-a4b85930eca7" (UID: "a3aa519a-57c4-46f2-8467-a4b85930eca7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199430 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199475 4574 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199486 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199494 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199503 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29sqb\" (UniqueName: \"kubernetes.io/projected/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-kube-api-access-29sqb\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199513 4574 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199521 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjpb\" (UniqueName: \"kubernetes.io/projected/a3aa519a-57c4-46f2-8467-a4b85930eca7-kube-api-access-4xjpb\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.199529 4574 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.202453 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" (UID: "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.212588 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-config-data" (OuterVolumeSpecName: "config-data") pod "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" (UID: "9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.262835 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kt5kv" event={"ID":"9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5","Type":"ContainerDied","Data":"cffa1a6d1d733ae188494049b46109ce8c87c2e9782ccaab6151690a209103f3"} Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.263007 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cffa1a6d1d733ae188494049b46109ce8c87c2e9782ccaab6151690a209103f3" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.263134 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-kt5kv" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.271868 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-75f5m" event={"ID":"a3aa519a-57c4-46f2-8467-a4b85930eca7","Type":"ContainerDied","Data":"ad4a879a626179cc0f8ae86c3c5c8994d3812a41b51d3440f0b419fbb0a14130"} Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.271925 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4a879a626179cc0f8ae86c3c5c8994d3812a41b51d3440f0b419fbb0a14130" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.272027 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-75f5m" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.273674 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3aa519a-57c4-46f2-8467-a4b85930eca7" (UID: "a3aa519a-57c4-46f2-8467-a4b85930eca7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.300805 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3aa519a-57c4-46f2-8467-a4b85930eca7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.301031 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:00 crc kubenswrapper[4574]: I1004 05:05:00.301132 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.065198 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5c85977bcb-np6n7"] Oct 04 05:05:01 crc kubenswrapper[4574]: E1004 05:05:01.065946 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3aa519a-57c4-46f2-8467-a4b85930eca7" containerName="keystone-bootstrap" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.065962 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3aa519a-57c4-46f2-8467-a4b85930eca7" containerName="keystone-bootstrap" Oct 04 05:05:01 crc kubenswrapper[4574]: E1004 05:05:01.065985 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" containerName="placement-db-sync" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.065992 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" containerName="placement-db-sync" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.066225 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3aa519a-57c4-46f2-8467-a4b85930eca7" 
containerName="keystone-bootstrap" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.066265 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" containerName="placement-db-sync" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.067407 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.069651 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.070521 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.070746 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.082130 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c85977bcb-np6n7"] Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.082994 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.083793 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z4lvd" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.163797 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-78786b8bfb-qgltl"] Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.168567 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.175146 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.186614 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.187014 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-49xhf" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.187190 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.187279 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.187769 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.193846 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78786b8bfb-qgltl"] Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.222941 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfpg\" (UniqueName: \"kubernetes.io/projected/462b910b-39e1-4a9e-a82c-3cfe77462a97-kube-api-access-kcfpg\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.223060 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/462b910b-39e1-4a9e-a82c-3cfe77462a97-logs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " 
pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.223092 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-internal-tls-certs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.223162 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-scripts\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.223253 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-public-tls-certs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.223299 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-config-data\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.223340 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-combined-ca-bundle\") pod \"placement-5c85977bcb-np6n7\" (UID: 
\"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325153 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-scripts\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325197 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-scripts\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325227 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-credential-keys\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325279 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-config-data\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325367 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-combined-ca-bundle\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " 
pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325434 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-fernet-keys\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325463 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-public-tls-certs\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.325529 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-public-tls-certs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.326004 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-config-data\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.326041 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-combined-ca-bundle\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 
05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.326084 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7fg\" (UniqueName: \"kubernetes.io/projected/1e4a50fe-8cee-4243-a215-9c82e358ea30-kube-api-access-dp7fg\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.326158 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfpg\" (UniqueName: \"kubernetes.io/projected/462b910b-39e1-4a9e-a82c-3cfe77462a97-kube-api-access-kcfpg\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.326187 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-internal-tls-certs\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.326272 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/462b910b-39e1-4a9e-a82c-3cfe77462a97-logs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.326316 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-internal-tls-certs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc 
kubenswrapper[4574]: I1004 05:05:01.330729 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/462b910b-39e1-4a9e-a82c-3cfe77462a97-logs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.332784 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-config-data\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.333921 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-public-tls-certs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.338635 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-combined-ca-bundle\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.341105 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-internal-tls-certs\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.341487 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/462b910b-39e1-4a9e-a82c-3cfe77462a97-scripts\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.358119 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfpg\" (UniqueName: \"kubernetes.io/projected/462b910b-39e1-4a9e-a82c-3cfe77462a97-kube-api-access-kcfpg\") pod \"placement-5c85977bcb-np6n7\" (UID: \"462b910b-39e1-4a9e-a82c-3cfe77462a97\") " pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.392627 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.427791 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7fg\" (UniqueName: \"kubernetes.io/projected/1e4a50fe-8cee-4243-a215-9c82e358ea30-kube-api-access-dp7fg\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.427884 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-internal-tls-certs\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.428895 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-scripts\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.428920 
4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-credential-keys\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.428948 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-config-data\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.428974 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-combined-ca-bundle\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.428990 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-fernet-keys\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.429012 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-public-tls-certs\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.432024 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-internal-tls-certs\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.437495 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-public-tls-certs\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.437698 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-combined-ca-bundle\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.447825 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-credential-keys\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.451763 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-fernet-keys\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.452389 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-config-data\") pod \"keystone-78786b8bfb-qgltl\" (UID: 
\"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.453290 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e4a50fe-8cee-4243-a215-9c82e358ea30-scripts\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.482772 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7fg\" (UniqueName: \"kubernetes.io/projected/1e4a50fe-8cee-4243-a215-9c82e358ea30-kube-api-access-dp7fg\") pod \"keystone-78786b8bfb-qgltl\" (UID: \"1e4a50fe-8cee-4243-a215-9c82e358ea30\") " pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:01 crc kubenswrapper[4574]: I1004 05:05:01.493712 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:09 crc kubenswrapper[4574]: I1004 05:05:09.710882 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:05:09 crc kubenswrapper[4574]: I1004 05:05:09.841371 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.415527 4574 generic.go:334] "Generic (PLEG): container finished" podID="271a7436-a272-479c-9473-decd7e54d73b" 
containerID="73df732f00cceeaa968b8349b02d63760fc46c1d13b94bffe3669b6ebb64e0eb" exitCode=137 Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.415560 4574 generic.go:334] "Generic (PLEG): container finished" podID="271a7436-a272-479c-9473-decd7e54d73b" containerID="6c8e3e895c71ad05f17660149cd8100595fb33c8905bedd64db036d5bb7f5275" exitCode=137 Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.415602 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8dc79ff-xkm7j" event={"ID":"271a7436-a272-479c-9473-decd7e54d73b","Type":"ContainerDied","Data":"73df732f00cceeaa968b8349b02d63760fc46c1d13b94bffe3669b6ebb64e0eb"} Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.415627 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8dc79ff-xkm7j" event={"ID":"271a7436-a272-479c-9473-decd7e54d73b","Type":"ContainerDied","Data":"6c8e3e895c71ad05f17660149cd8100595fb33c8905bedd64db036d5bb7f5275"} Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.417089 4574 generic.go:334] "Generic (PLEG): container finished" podID="6970e30d-161e-4b7f-bcff-81882edd065f" containerID="888cc9411958952c99824425ce6f7353fcf0b70c05beb7b64497d3c5a594891e" exitCode=137 Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.417127 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f9bc9c8f-q72p7" event={"ID":"6970e30d-161e-4b7f-bcff-81882edd065f","Type":"ContainerDied","Data":"888cc9411958952c99824425ce6f7353fcf0b70c05beb7b64497d3c5a594891e"} Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.418796 4574 generic.go:334] "Generic (PLEG): container finished" podID="de90c2bd-086a-4b9b-846e-048709c26ede" containerID="cf2e9b1aad4204f63631117978bcfe9485f1c15ac9c59c483c242881f4b0267d" exitCode=137 Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.418817 4574 generic.go:334] "Generic (PLEG): container finished" podID="de90c2bd-086a-4b9b-846e-048709c26ede" 
containerID="6026ea67f9a1d6ffec9c77e732f9e627e7e2130e17ce502fb73629de3fbe4edf" exitCode=137 Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.418831 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978f45d5c-g5q99" event={"ID":"de90c2bd-086a-4b9b-846e-048709c26ede","Type":"ContainerDied","Data":"cf2e9b1aad4204f63631117978bcfe9485f1c15ac9c59c483c242881f4b0267d"} Oct 04 05:05:11 crc kubenswrapper[4574]: I1004 05:05:11.418848 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978f45d5c-g5q99" event={"ID":"de90c2bd-086a-4b9b-846e-048709c26ede","Type":"ContainerDied","Data":"6026ea67f9a1d6ffec9c77e732f9e627e7e2130e17ce502fb73629de3fbe4edf"} Oct 04 05:05:12 crc kubenswrapper[4574]: I1004 05:05:12.449374 4574 generic.go:334] "Generic (PLEG): container finished" podID="59854ff7-fdcf-4a21-9fa6-9ab422be068e" containerID="18bd4c80932bd536c1168d16197471fcf990efe13d58a500aafc5596f21a6691" exitCode=0 Oct 04 05:05:12 crc kubenswrapper[4574]: I1004 05:05:12.449496 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnbbl" event={"ID":"59854ff7-fdcf-4a21-9fa6-9ab422be068e","Type":"ContainerDied","Data":"18bd4c80932bd536c1168d16197471fcf990efe13d58a500aafc5596f21a6691"} Oct 04 05:05:13 crc kubenswrapper[4574]: I1004 05:05:13.462607 4574 generic.go:334] "Generic (PLEG): container finished" podID="ad26bb6b-4342-4bfc-89b0-bb562b16af11" containerID="255ba195d8c5a3f6521995b6b5d51b8d8c42f900cf7009d5dee3896acc9b68fb" exitCode=0 Oct 04 05:05:13 crc kubenswrapper[4574]: I1004 05:05:13.462648 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8c9r7" event={"ID":"ad26bb6b-4342-4bfc-89b0-bb562b16af11","Type":"ContainerDied","Data":"255ba195d8c5a3f6521995b6b5d51b8d8c42f900cf7009d5dee3896acc9b68fb"} Oct 04 05:05:15 crc kubenswrapper[4574]: I1004 05:05:15.972364 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.133073 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t8qv\" (UniqueName: \"kubernetes.io/projected/ad26bb6b-4342-4bfc-89b0-bb562b16af11-kube-api-access-6t8qv\") pod \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.133254 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-config\") pod \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.133306 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-combined-ca-bundle\") pod \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\" (UID: \"ad26bb6b-4342-4bfc-89b0-bb562b16af11\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.148192 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad26bb6b-4342-4bfc-89b0-bb562b16af11-kube-api-access-6t8qv" (OuterVolumeSpecName: "kube-api-access-6t8qv") pod "ad26bb6b-4342-4bfc-89b0-bb562b16af11" (UID: "ad26bb6b-4342-4bfc-89b0-bb562b16af11"). InnerVolumeSpecName "kube-api-access-6t8qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.196802 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-config" (OuterVolumeSpecName: "config") pod "ad26bb6b-4342-4bfc-89b0-bb562b16af11" (UID: "ad26bb6b-4342-4bfc-89b0-bb562b16af11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.212394 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad26bb6b-4342-4bfc-89b0-bb562b16af11" (UID: "ad26bb6b-4342-4bfc-89b0-bb562b16af11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.235727 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.236017 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad26bb6b-4342-4bfc-89b0-bb562b16af11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.236030 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t8qv\" (UniqueName: \"kubernetes.io/projected/ad26bb6b-4342-4bfc-89b0-bb562b16af11-kube-api-access-6t8qv\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.492626 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8c9r7" event={"ID":"ad26bb6b-4342-4bfc-89b0-bb562b16af11","Type":"ContainerDied","Data":"deb142936eaac676139f54a39f8129dacd27c1869d329db42dcc9de3eb2fe886"} Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.492677 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb142936eaac676139f54a39f8129dacd27c1869d329db42dcc9de3eb2fe886" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.492762 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8c9r7" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.668008 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:05:16 crc kubenswrapper[4574]: E1004 05:05:16.672065 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 04 05:05:16 crc kubenswrapper[4574]: E1004 05:05:16.672282 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tlsc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*
42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-pms9r_openstack(097cde22-53c8-44ef-90c9-7e7dd7c43609): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:05:16 crc kubenswrapper[4574]: E1004 05:05:16.673611 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-pms9r" podUID="097cde22-53c8-44ef-90c9-7e7dd7c43609" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.678391 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.851833 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-config-data\") pod \"6970e30d-161e-4b7f-bcff-81882edd065f\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.851896 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-config-data\") pod \"271a7436-a272-479c-9473-decd7e54d73b\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852118 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv8w6\" (UniqueName: \"kubernetes.io/projected/6970e30d-161e-4b7f-bcff-81882edd065f-kube-api-access-mv8w6\") pod 
\"6970e30d-161e-4b7f-bcff-81882edd065f\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852173 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvw6z\" (UniqueName: \"kubernetes.io/projected/271a7436-a272-479c-9473-decd7e54d73b-kube-api-access-jvw6z\") pod \"271a7436-a272-479c-9473-decd7e54d73b\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852200 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6970e30d-161e-4b7f-bcff-81882edd065f-horizon-secret-key\") pod \"6970e30d-161e-4b7f-bcff-81882edd065f\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852255 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/271a7436-a272-479c-9473-decd7e54d73b-horizon-secret-key\") pod \"271a7436-a272-479c-9473-decd7e54d73b\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852375 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-scripts\") pod \"271a7436-a272-479c-9473-decd7e54d73b\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852435 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6970e30d-161e-4b7f-bcff-81882edd065f-logs\") pod \"6970e30d-161e-4b7f-bcff-81882edd065f\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852489 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-scripts\") pod \"6970e30d-161e-4b7f-bcff-81882edd065f\" (UID: \"6970e30d-161e-4b7f-bcff-81882edd065f\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.852522 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271a7436-a272-479c-9473-decd7e54d73b-logs\") pod \"271a7436-a272-479c-9473-decd7e54d73b\" (UID: \"271a7436-a272-479c-9473-decd7e54d73b\") " Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.855134 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6970e30d-161e-4b7f-bcff-81882edd065f-logs" (OuterVolumeSpecName: "logs") pod "6970e30d-161e-4b7f-bcff-81882edd065f" (UID: "6970e30d-161e-4b7f-bcff-81882edd065f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.856434 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6970e30d-161e-4b7f-bcff-81882edd065f-kube-api-access-mv8w6" (OuterVolumeSpecName: "kube-api-access-mv8w6") pod "6970e30d-161e-4b7f-bcff-81882edd065f" (UID: "6970e30d-161e-4b7f-bcff-81882edd065f"). InnerVolumeSpecName "kube-api-access-mv8w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.857867 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/271a7436-a272-479c-9473-decd7e54d73b-logs" (OuterVolumeSpecName: "logs") pod "271a7436-a272-479c-9473-decd7e54d73b" (UID: "271a7436-a272-479c-9473-decd7e54d73b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.874523 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271a7436-a272-479c-9473-decd7e54d73b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "271a7436-a272-479c-9473-decd7e54d73b" (UID: "271a7436-a272-479c-9473-decd7e54d73b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.874830 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271a7436-a272-479c-9473-decd7e54d73b-kube-api-access-jvw6z" (OuterVolumeSpecName: "kube-api-access-jvw6z") pod "271a7436-a272-479c-9473-decd7e54d73b" (UID: "271a7436-a272-479c-9473-decd7e54d73b"). InnerVolumeSpecName "kube-api-access-jvw6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.875868 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6970e30d-161e-4b7f-bcff-81882edd065f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6970e30d-161e-4b7f-bcff-81882edd065f" (UID: "6970e30d-161e-4b7f-bcff-81882edd065f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.885528 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-scripts" (OuterVolumeSpecName: "scripts") pod "6970e30d-161e-4b7f-bcff-81882edd065f" (UID: "6970e30d-161e-4b7f-bcff-81882edd065f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.897672 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-config-data" (OuterVolumeSpecName: "config-data") pod "271a7436-a272-479c-9473-decd7e54d73b" (UID: "271a7436-a272-479c-9473-decd7e54d73b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.914572 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-scripts" (OuterVolumeSpecName: "scripts") pod "271a7436-a272-479c-9473-decd7e54d73b" (UID: "271a7436-a272-479c-9473-decd7e54d73b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.915142 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-config-data" (OuterVolumeSpecName: "config-data") pod "6970e30d-161e-4b7f-bcff-81882edd065f" (UID: "6970e30d-161e-4b7f-bcff-81882edd065f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955397 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv8w6\" (UniqueName: \"kubernetes.io/projected/6970e30d-161e-4b7f-bcff-81882edd065f-kube-api-access-mv8w6\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955422 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvw6z\" (UniqueName: \"kubernetes.io/projected/271a7436-a272-479c-9473-decd7e54d73b-kube-api-access-jvw6z\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955432 4574 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6970e30d-161e-4b7f-bcff-81882edd065f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955441 4574 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/271a7436-a272-479c-9473-decd7e54d73b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955450 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955459 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6970e30d-161e-4b7f-bcff-81882edd065f-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955467 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955476 4574 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/271a7436-a272-479c-9473-decd7e54d73b-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955483 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6970e30d-161e-4b7f-bcff-81882edd065f-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:16 crc kubenswrapper[4574]: I1004 05:05:16.955492 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/271a7436-a272-479c-9473-decd7e54d73b-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.154954 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65965d6475-d6mvn"] Oct 04 05:05:17 crc kubenswrapper[4574]: E1004 05:05:17.165593 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6970e30d-161e-4b7f-bcff-81882edd065f" containerName="horizon" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.165614 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6970e30d-161e-4b7f-bcff-81882edd065f" containerName="horizon" Oct 04 05:05:17 crc kubenswrapper[4574]: E1004 05:05:17.165669 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.165675 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon" Oct 04 05:05:17 crc kubenswrapper[4574]: E1004 05:05:17.165689 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad26bb6b-4342-4bfc-89b0-bb562b16af11" containerName="neutron-db-sync" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.165711 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad26bb6b-4342-4bfc-89b0-bb562b16af11" 
containerName="neutron-db-sync" Oct 04 05:05:17 crc kubenswrapper[4574]: E1004 05:05:17.165724 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon-log" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.165730 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon-log" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.165964 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="6970e30d-161e-4b7f-bcff-81882edd065f" containerName="horizon" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.165981 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon-log" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.166000 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="271a7436-a272-479c-9473-decd7e54d73b" containerName="horizon" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.166029 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad26bb6b-4342-4bfc-89b0-bb562b16af11" containerName="neutron-db-sync" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.167526 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.210523 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-d6mvn"] Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.263361 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.263433 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shbv4\" (UniqueName: \"kubernetes.io/projected/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-kube-api-access-shbv4\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.263499 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-svc\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.263541 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.263599 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.263639 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-config\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.322828 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fdd4f7798-vj9tl"] Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.327159 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.332170 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.332464 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7qmf7" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.333125 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.335171 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.354350 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fdd4f7798-vj9tl"] Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365277 4574 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-shbv4\" (UniqueName: \"kubernetes.io/projected/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-kube-api-access-shbv4\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365338 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-httpd-config\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365376 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-svc\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365398 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-config\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365413 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-ovndb-tls-certs\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365438 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365482 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365500 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxp9q\" (UniqueName: \"kubernetes.io/projected/cec43bff-ec9c-4c1f-975c-85c3292c3458-kube-api-access-zxp9q\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365525 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-combined-ca-bundle\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365566 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-config\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.365617 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.367367 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.367601 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-svc\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.367965 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.368129 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-config\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.369053 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-nb\") pod 
\"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.400974 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shbv4\" (UniqueName: \"kubernetes.io/projected/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-kube-api-access-shbv4\") pod \"dnsmasq-dns-65965d6475-d6mvn\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.468275 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-httpd-config\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.468352 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-config\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.468378 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-ovndb-tls-certs\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.468475 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxp9q\" (UniqueName: \"kubernetes.io/projected/cec43bff-ec9c-4c1f-975c-85c3292c3458-kube-api-access-zxp9q\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " 
pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.468502 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-combined-ca-bundle\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.474199 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-combined-ca-bundle\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.474501 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-ovndb-tls-certs\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.477478 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-config\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.486608 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-httpd-config\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.487891 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zxp9q\" (UniqueName: \"kubernetes.io/projected/cec43bff-ec9c-4c1f-975c-85c3292c3458-kube-api-access-zxp9q\") pod \"neutron-6fdd4f7798-vj9tl\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.489391 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.525749 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8dc79ff-xkm7j" event={"ID":"271a7436-a272-479c-9473-decd7e54d73b","Type":"ContainerDied","Data":"13bf89a0d72254c8ba0918da78b1bd4177a6d5a866f0edb9a3089e29a103c088"} Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.525811 4574 scope.go:117] "RemoveContainer" containerID="73df732f00cceeaa968b8349b02d63760fc46c1d13b94bffe3669b6ebb64e0eb" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.525963 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8dc79ff-xkm7j" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.534664 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f9bc9c8f-q72p7" event={"ID":"6970e30d-161e-4b7f-bcff-81882edd065f","Type":"ContainerDied","Data":"46e802ddcde6bc52cfcd123c20e1109784fa16931c281eb77982ca1315aac598"} Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.534677 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57f9bc9c8f-q72p7" Oct 04 05:05:17 crc kubenswrapper[4574]: E1004 05:05:17.543104 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-pms9r" podUID="097cde22-53c8-44ef-90c9-7e7dd7c43609" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.608708 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57f9bc9c8f-q72p7"] Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.624362 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57f9bc9c8f-q72p7"] Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.660556 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.663437 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f8dc79ff-xkm7j"] Oct 04 05:05:17 crc kubenswrapper[4574]: I1004 05:05:17.673437 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f8dc79ff-xkm7j"] Oct 04 05:05:18 crc kubenswrapper[4574]: I1004 05:05:18.746485 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="271a7436-a272-479c-9473-decd7e54d73b" path="/var/lib/kubelet/pods/271a7436-a272-479c-9473-decd7e54d73b/volumes" Oct 04 05:05:18 crc kubenswrapper[4574]: I1004 05:05:18.748748 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6970e30d-161e-4b7f-bcff-81882edd065f" path="/var/lib/kubelet/pods/6970e30d-161e-4b7f-bcff-81882edd065f/volumes" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.405214 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.405306 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.405349 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.406077 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c021ed99dab79e0bc143879c96505d7aa34ab49c6d5b17fbf9b9b39bbe04b86"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.406134 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://0c021ed99dab79e0bc143879c96505d7aa34ab49c6d5b17fbf9b9b39bbe04b86" gracePeriod=600 Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.710954 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 
05:05:19.711333 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.712042 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"3e4dc4fc365b9ba947066873c9c3d152cb971dadf939b36d9d774912264c3816"} pod="openstack/horizon-57c7ff446b-7tmwn" containerMessage="Container horizon failed startup probe, will be restarted" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.712087 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" containerID="cri-o://3e4dc4fc365b9ba947066873c9c3d152cb971dadf939b36d9d774912264c3816" gracePeriod=30 Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.840888 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.840967 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.841742 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"bafa808bdf2a35dee0e61ef90e7b8b4999e39f07d3cde96c5386527343a5b987"} pod="openstack/horizon-57bfb4d496-nv6hv" containerMessage="Container horizon failed startup probe, will be restarted" Oct 04 05:05:19 crc kubenswrapper[4574]: I1004 05:05:19.841781 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57bfb4d496-nv6hv" 
podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" containerID="cri-o://bafa808bdf2a35dee0e61ef90e7b8b4999e39f07d3cde96c5386527343a5b987" gracePeriod=30 Oct 04 05:05:20 crc kubenswrapper[4574]: E1004 05:05:20.005358 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 04 05:05:20 crc kubenswrapper[4574]: E1004 05:05:20.005546 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Na
me:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxlrl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-khrbr_openstack(9bd3ebd3-498c-4070-9de7-eab9d2866108): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:05:20 crc kubenswrapper[4574]: E1004 05:05:20.006701 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-khrbr" podUID="9bd3ebd3-498c-4070-9de7-eab9d2866108" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.144805 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.155210 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnbbl" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.284559 4574 scope.go:117] "RemoveContainer" containerID="6c8e3e895c71ad05f17660149cd8100595fb33c8905bedd64db036d5bb7f5275" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.320076 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54746bc5fc-22pbj"] Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329012 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-config-data\") pod \"de90c2bd-086a-4b9b-846e-048709c26ede\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329087 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8pm6\" (UniqueName: \"kubernetes.io/projected/59854ff7-fdcf-4a21-9fa6-9ab422be068e-kube-api-access-q8pm6\") pod \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329160 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-scripts\") pod \"de90c2bd-086a-4b9b-846e-048709c26ede\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329187 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-combined-ca-bundle\") pod \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329273 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/de90c2bd-086a-4b9b-846e-048709c26ede-horizon-secret-key\") pod \"de90c2bd-086a-4b9b-846e-048709c26ede\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329322 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-config-data\") pod \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329393 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-db-sync-config-data\") pod \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\" (UID: \"59854ff7-fdcf-4a21-9fa6-9ab422be068e\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329423 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9zs7\" (UniqueName: \"kubernetes.io/projected/de90c2bd-086a-4b9b-846e-048709c26ede-kube-api-access-c9zs7\") pod \"de90c2bd-086a-4b9b-846e-048709c26ede\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.329455 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de90c2bd-086a-4b9b-846e-048709c26ede-logs\") pod \"de90c2bd-086a-4b9b-846e-048709c26ede\" (UID: \"de90c2bd-086a-4b9b-846e-048709c26ede\") " Oct 04 05:05:20 crc kubenswrapper[4574]: E1004 05:05:20.330946 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.330967 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon" Oct 04 05:05:20 crc 
kubenswrapper[4574]: E1004 05:05:20.330977 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon-log" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.330984 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon-log" Oct 04 05:05:20 crc kubenswrapper[4574]: E1004 05:05:20.331025 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59854ff7-fdcf-4a21-9fa6-9ab422be068e" containerName="glance-db-sync" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.331031 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="59854ff7-fdcf-4a21-9fa6-9ab422be068e" containerName="glance-db-sync" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.331288 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="59854ff7-fdcf-4a21-9fa6-9ab422be068e" containerName="glance-db-sync" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.331307 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon-log" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.331321 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" containerName="horizon" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.332190 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54746bc5fc-22pbj"] Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.332288 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.337368 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59854ff7-fdcf-4a21-9fa6-9ab422be068e-kube-api-access-q8pm6" (OuterVolumeSpecName: "kube-api-access-q8pm6") pod "59854ff7-fdcf-4a21-9fa6-9ab422be068e" (UID: "59854ff7-fdcf-4a21-9fa6-9ab422be068e"). InnerVolumeSpecName "kube-api-access-q8pm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.339821 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.340073 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.345412 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de90c2bd-086a-4b9b-846e-048709c26ede-logs" (OuterVolumeSpecName: "logs") pod "de90c2bd-086a-4b9b-846e-048709c26ede" (UID: "de90c2bd-086a-4b9b-846e-048709c26ede"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.354371 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "59854ff7-fdcf-4a21-9fa6-9ab422be068e" (UID: "59854ff7-fdcf-4a21-9fa6-9ab422be068e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.370464 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de90c2bd-086a-4b9b-846e-048709c26ede-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "de90c2bd-086a-4b9b-846e-048709c26ede" (UID: "de90c2bd-086a-4b9b-846e-048709c26ede"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.375446 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de90c2bd-086a-4b9b-846e-048709c26ede-kube-api-access-c9zs7" (OuterVolumeSpecName: "kube-api-access-c9zs7") pod "de90c2bd-086a-4b9b-846e-048709c26ede" (UID: "de90c2bd-086a-4b9b-846e-048709c26ede"). InnerVolumeSpecName "kube-api-access-c9zs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.448814 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-public-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.448868 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4428\" (UniqueName: \"kubernetes.io/projected/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-kube-api-access-s4428\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449033 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-internal-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449078 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-ovndb-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449203 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-config\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449229 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-httpd-config\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449356 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-combined-ca-bundle\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449428 4574 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/de90c2bd-086a-4b9b-846e-048709c26ede-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449444 4574 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449458 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9zs7\" (UniqueName: \"kubernetes.io/projected/de90c2bd-086a-4b9b-846e-048709c26ede-kube-api-access-c9zs7\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449473 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de90c2bd-086a-4b9b-846e-048709c26ede-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449487 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8pm6\" (UniqueName: \"kubernetes.io/projected/59854ff7-fdcf-4a21-9fa6-9ab422be068e-kube-api-access-q8pm6\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.449955 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-config-data" (OuterVolumeSpecName: "config-data") pod "de90c2bd-086a-4b9b-846e-048709c26ede" (UID: "de90c2bd-086a-4b9b-846e-048709c26ede"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.452018 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-scripts" (OuterVolumeSpecName: "scripts") pod "de90c2bd-086a-4b9b-846e-048709c26ede" (UID: "de90c2bd-086a-4b9b-846e-048709c26ede"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.475122 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59854ff7-fdcf-4a21-9fa6-9ab422be068e" (UID: "59854ff7-fdcf-4a21-9fa6-9ab422be068e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.510465 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-config-data" (OuterVolumeSpecName: "config-data") pod "59854ff7-fdcf-4a21-9fa6-9ab422be068e" (UID: "59854ff7-fdcf-4a21-9fa6-9ab422be068e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.521826 4574 scope.go:117] "RemoveContainer" containerID="888cc9411958952c99824425ce6f7353fcf0b70c05beb7b64497d3c5a594891e" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556467 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-config\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556518 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-httpd-config\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556580 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-combined-ca-bundle\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556630 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-public-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556657 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4428\" (UniqueName: \"kubernetes.io/projected/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-kube-api-access-s4428\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556730 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-internal-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556758 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-ovndb-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556820 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556836 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556848 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de90c2bd-086a-4b9b-846e-048709c26ede-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.556859 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59854ff7-fdcf-4a21-9fa6-9ab422be068e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.561732 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-config\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.565742 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-ovndb-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.567084 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-internal-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " 
pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.569067 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-public-tls-certs\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.570012 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-httpd-config\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.571431 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-combined-ca-bundle\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.583254 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4428\" (UniqueName: \"kubernetes.io/projected/e736cc6e-edb6-4fad-8687-6c4e2a85d0a0-kube-api-access-s4428\") pod \"neutron-54746bc5fc-22pbj\" (UID: \"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0\") " pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.614147 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="0c021ed99dab79e0bc143879c96505d7aa34ab49c6d5b17fbf9b9b39bbe04b86" exitCode=0 Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.614217 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" 
event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"0c021ed99dab79e0bc143879c96505d7aa34ab49c6d5b17fbf9b9b39bbe04b86"} Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.616169 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6978f45d5c-g5q99" event={"ID":"de90c2bd-086a-4b9b-846e-048709c26ede","Type":"ContainerDied","Data":"886b7ef249256ec7aedaf92b6fd1c5b0540225fd1efe8362900b42e4911cc236"} Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.616258 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6978f45d5c-g5q99" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.640432 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnbbl" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.640517 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnbbl" event={"ID":"59854ff7-fdcf-4a21-9fa6-9ab422be068e","Type":"ContainerDied","Data":"3348d72c9eceb452ec55dff079a9f2b9778ca07cb8dbdaf716989e3aa26349a6"} Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.640583 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3348d72c9eceb452ec55dff079a9f2b9778ca07cb8dbdaf716989e3aa26349a6" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.666729 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c85977bcb-np6n7"] Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.740481 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.809826 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6978f45d5c-g5q99"] Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.809870 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6978f45d5c-g5q99"] Oct 04 05:05:20 crc kubenswrapper[4574]: I1004 05:05:20.893657 4574 scope.go:117] "RemoveContainer" containerID="a99234efe67f037290baa95758d3a1f0d549bea91113058aaa5fd090767eb42e" Oct 04 05:05:20 crc kubenswrapper[4574]: E1004 05:05:20.894449 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-khrbr" podUID="9bd3ebd3-498c-4070-9de7-eab9d2866108" Oct 04 05:05:20 crc kubenswrapper[4574]: W1004 05:05:20.913941 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod462b910b_39e1_4a9e_a82c_3cfe77462a97.slice/crio-e2f36a5bb63ac20944f4876fa1a010bb53eab3297f7a6af6d9a9123c9dd65c31 WatchSource:0}: Error finding container e2f36a5bb63ac20944f4876fa1a010bb53eab3297f7a6af6d9a9123c9dd65c31: Status 404 returned error can't find the container with id e2f36a5bb63ac20944f4876fa1a010bb53eab3297f7a6af6d9a9123c9dd65c31 Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.159948 4574 scope.go:117] "RemoveContainer" containerID="cf2e9b1aad4204f63631117978bcfe9485f1c15ac9c59c483c242881f4b0267d" Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.232045 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78786b8bfb-qgltl"] Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.233991 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-d6mvn"] 
Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.274560 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fdd4f7798-vj9tl"] Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.430555 4574 scope.go:117] "RemoveContainer" containerID="6026ea67f9a1d6ffec9c77e732f9e627e7e2130e17ce502fb73629de3fbe4edf" Oct 04 05:05:21 crc kubenswrapper[4574]: W1004 05:05:21.433387 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcec43bff_ec9c_4c1f_975c_85c3292c3458.slice/crio-2779e3e30c1951ca8b280e1f7f039da0646b0d1ee198fa4e77cb8170a577876f WatchSource:0}: Error finding container 2779e3e30c1951ca8b280e1f7f039da0646b0d1ee198fa4e77cb8170a577876f: Status 404 returned error can't find the container with id 2779e3e30c1951ca8b280e1f7f039da0646b0d1ee198fa4e77cb8170a577876f Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.649118 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-d6mvn"] Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.766975 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c85977bcb-np6n7" event={"ID":"462b910b-39e1-4a9e-a82c-3cfe77462a97","Type":"ContainerStarted","Data":"9bbdab35af7a7b884abb6bbec20aff1528d7d25f08f91d06476308062aac7f05"} Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.767028 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c85977bcb-np6n7" event={"ID":"462b910b-39e1-4a9e-a82c-3cfe77462a97","Type":"ContainerStarted","Data":"e2f36a5bb63ac20944f4876fa1a010bb53eab3297f7a6af6d9a9123c9dd65c31"} Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.791062 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-d6mvn" event={"ID":"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310","Type":"ContainerStarted","Data":"6354e3fbbd9509df75f447ad484b0d175bbe2969b8a4f0fc5893f8f26924f124"} Oct 04 
05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.795272 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rqc9t"] Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.826190 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.884761 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78786b8bfb-qgltl" event={"ID":"1e4a50fe-8cee-4243-a215-9c82e358ea30","Type":"ContainerStarted","Data":"6c6c2ba2583a88bb3ac4d3dabdc49fc1bd3f5398fba3dd10085e203680112a5b"} Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.922512 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fdd4f7798-vj9tl" event={"ID":"cec43bff-ec9c-4c1f-975c-85c3292c3458","Type":"ContainerStarted","Data":"2779e3e30c1951ca8b280e1f7f039da0646b0d1ee198fa4e77cb8170a577876f"} Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.940564 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rqc9t"] Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.973263 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-config\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.973368 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.973411 
4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwsnv\" (UniqueName: \"kubernetes.io/projected/e598d7f1-865d-47bf-9263-ca027b3c92c9-kube-api-access-hwsnv\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.973439 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.973464 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:21 crc kubenswrapper[4574]: I1004 05:05:21.973493 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.000885 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerStarted","Data":"6bd0bbfe123436007d6ff33e2ea6122d521f5fc915f66e78b8f202915a0e0e32"} Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.037218 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"422e81dba527fdce5fb46863c657f7a61bb4d0e601b192c209383ffeaf65198f"} Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.075219 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-config\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.075311 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.075370 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwsnv\" (UniqueName: \"kubernetes.io/projected/e598d7f1-865d-47bf-9263-ca027b3c92c9-kube-api-access-hwsnv\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.075396 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.075417 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.075450 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.076386 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.076944 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-config\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.077759 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.078140 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-svc\") pod 
\"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.083971 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.122099 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54746bc5fc-22pbj"] Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.146775 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwsnv\" (UniqueName: \"kubernetes.io/projected/e598d7f1-865d-47bf-9263-ca027b3c92c9-kube-api-access-hwsnv\") pod \"dnsmasq-dns-84b966f6c9-rqc9t\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.243204 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:22 crc kubenswrapper[4574]: E1004 05:05:22.541397 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a2e55c6_3f30_4c4d_a8dd_6952b97e4310.slice/crio-conmon-4136d35072d2fb75de1439c94f6b3545145cf37e42bd6c4f8dad7775112ab0ad.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.771980 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de90c2bd-086a-4b9b-846e-048709c26ede" path="/var/lib/kubelet/pods/de90c2bd-086a-4b9b-846e-048709c26ede/volumes" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.871210 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.872846 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.875108 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zkflm" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.875429 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.883089 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.899895 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.984064 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rqc9t"] Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.994509 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.994576 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.994614 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.994671 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nd94\" (UniqueName: \"kubernetes.io/projected/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-kube-api-access-7nd94\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.994702 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-config-data\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.994726 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-logs\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:22 crc kubenswrapper[4574]: I1004 05:05:22.994772 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.055022 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54746bc5fc-22pbj" 
event={"ID":"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0","Type":"ContainerStarted","Data":"df6d62ca808f3643310994a46e401c09cd467445433aafc8a01b23012554b635"} Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.055072 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54746bc5fc-22pbj" event={"ID":"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0","Type":"ContainerStarted","Data":"9bb61f330af61ac7efa99de84f59dc75fc421339b31c2fab723011df6b141cb3"} Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.066671 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78786b8bfb-qgltl" event={"ID":"1e4a50fe-8cee-4243-a215-9c82e358ea30","Type":"ContainerStarted","Data":"21a9255fc8e7c0258cf261dcc732765421d3d90687ad0455d947d3594e6b3986"} Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.067990 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.072117 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fdd4f7798-vj9tl" event={"ID":"cec43bff-ec9c-4c1f-975c-85c3292c3458","Type":"ContainerStarted","Data":"655dae3bd54eaa26e8f66741694c4f05a6d40744481378a1a8f28f6c7e36ea08"} Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.076933 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" event={"ID":"e598d7f1-865d-47bf-9263-ca027b3c92c9","Type":"ContainerStarted","Data":"d724b7374a7b2bc0a76bb605c1a53317c993cf374bec222e7f0d8c61d16d4a97"} Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.094522 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-78786b8bfb-qgltl" podStartSLOduration=22.094497936 podStartE2EDuration="22.094497936s" podCreationTimestamp="2025-10-04 05:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-04 05:05:23.090807901 +0000 UTC m=+1148.944950953" watchObservedRunningTime="2025-10-04 05:05:23.094497936 +0000 UTC m=+1148.948640978" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.095578 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c85977bcb-np6n7" event={"ID":"462b910b-39e1-4a9e-a82c-3cfe77462a97","Type":"ContainerStarted","Data":"2f296fbda62e700df76af82af8f1938c053e93ce9b7686ccefbddcd873645f9c"} Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.096139 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.096196 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-scripts\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.096275 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nd94\" (UniqueName: \"kubernetes.io/projected/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-kube-api-access-7nd94\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.096452 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.098158 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5c85977bcb-np6n7" Oct 04 
05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.098079 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-config-data\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.098316 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-logs\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.098679 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-logs\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.096797 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.098846 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.099107 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.109625 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-scripts\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.111735 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.113090 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-config-data\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.124073 4574 generic.go:334] "Generic (PLEG): container finished" podID="7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" containerID="4136d35072d2fb75de1439c94f6b3545145cf37e42bd6c4f8dad7775112ab0ad" exitCode=0 Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.124103 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-d6mvn" event={"ID":"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310","Type":"ContainerDied","Data":"4136d35072d2fb75de1439c94f6b3545145cf37e42bd6c4f8dad7775112ab0ad"} Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 
05:05:23.125518 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.129177 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nd94\" (UniqueName: \"kubernetes.io/projected/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-kube-api-access-7nd94\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.202916 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5c85977bcb-np6n7" podStartSLOduration=22.202898182 podStartE2EDuration="22.202898182s" podCreationTimestamp="2025-10-04 05:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:23.137987264 +0000 UTC m=+1148.992130316" watchObservedRunningTime="2025-10-04 05:05:23.202898182 +0000 UTC m=+1149.057041224" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.203354 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.218081 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.241644 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.248400 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.250705 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.312111 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.317688 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.317736 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.317780 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 
05:05:23.320805 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h2sk\" (UniqueName: \"kubernetes.io/projected/9be13b58-cf10-4e6e-ae12-1b93ab475796-kube-api-access-2h2sk\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.321641 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-logs\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.321716 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.322000 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.425314 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc 
kubenswrapper[4574]: I1004 05:05:23.425380 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.425400 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.425421 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.425468 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h2sk\" (UniqueName: \"kubernetes.io/projected/9be13b58-cf10-4e6e-ae12-1b93ab475796-kube-api-access-2h2sk\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.425493 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-logs\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.425517 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.426275 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.427921 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.435652 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.435987 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-logs\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.447354 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.448255 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h2sk\" (UniqueName: \"kubernetes.io/projected/9be13b58-cf10-4e6e-ae12-1b93ab475796-kube-api-access-2h2sk\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.451567 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.524078 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.653302 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:23 crc kubenswrapper[4574]: I1004 05:05:23.839118 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.044091 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-config\") pod \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.044432 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-svc\") pod \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.044506 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-swift-storage-0\") pod \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.044560 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-nb\") pod \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.044673 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-sb\") pod \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.044698 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shbv4\" 
(UniqueName: \"kubernetes.io/projected/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-kube-api-access-shbv4\") pod \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\" (UID: \"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310\") " Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.051873 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-kube-api-access-shbv4" (OuterVolumeSpecName: "kube-api-access-shbv4") pod "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" (UID: "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310"). InnerVolumeSpecName "kube-api-access-shbv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.080213 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" (UID: "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.091727 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" (UID: "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.152625 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-config" (OuterVolumeSpecName: "config") pod "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" (UID: "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.158310 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" (UID: "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.162495 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shbv4\" (UniqueName: \"kubernetes.io/projected/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-kube-api-access-shbv4\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.162536 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.162548 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.162559 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.162572 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.172540 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" (UID: "7a2e55c6-3f30-4c4d-a8dd-6952b97e4310"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.246001 4574 generic.go:334] "Generic (PLEG): container finished" podID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerID="91613fe8fee7919f3cfff79db0d7ce3e20fc9185cede4edd77893dceaf30cec0" exitCode=0 Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.246081 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" event={"ID":"e598d7f1-865d-47bf-9263-ca027b3c92c9","Type":"ContainerDied","Data":"91613fe8fee7919f3cfff79db0d7ce3e20fc9185cede4edd77893dceaf30cec0"} Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.266089 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.304578 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-d6mvn" event={"ID":"7a2e55c6-3f30-4c4d-a8dd-6952b97e4310","Type":"ContainerDied","Data":"6354e3fbbd9509df75f447ad484b0d175bbe2969b8a4f0fc5893f8f26924f124"} Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.304631 4574 scope.go:117] "RemoveContainer" containerID="4136d35072d2fb75de1439c94f6b3545145cf37e42bd6c4f8dad7775112ab0ad" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.304910 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-d6mvn" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.332517 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54746bc5fc-22pbj" event={"ID":"e736cc6e-edb6-4fad-8687-6c4e2a85d0a0","Type":"ContainerStarted","Data":"6aa548c32d69024131ba62be21e8aaa715b25b6a1a48d6654b30244add7b3bba"} Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.334002 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.382539 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.385545 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54746bc5fc-22pbj" podStartSLOduration=4.385522349 podStartE2EDuration="4.385522349s" podCreationTimestamp="2025-10-04 05:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:24.357945154 +0000 UTC m=+1150.212088196" watchObservedRunningTime="2025-10-04 05:05:24.385522349 +0000 UTC m=+1150.239665391" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.397281 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fdd4f7798-vj9tl" event={"ID":"cec43bff-ec9c-4c1f-975c-85c3292c3458","Type":"ContainerStarted","Data":"67d119404faa86d73dc38c89ef684141fecea4762f7fa63a1580749cea1c68c2"} Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.400161 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.443876 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-d6mvn"] Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.455178 
4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-d6mvn"] Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.467324 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fdd4f7798-vj9tl" podStartSLOduration=7.467295457 podStartE2EDuration="7.467295457s" podCreationTimestamp="2025-10-04 05:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:24.454306947 +0000 UTC m=+1150.308449989" watchObservedRunningTime="2025-10-04 05:05:24.467295457 +0000 UTC m=+1150.321438499" Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.573643 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:24 crc kubenswrapper[4574]: W1004 05:05:24.593922 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9be13b58_cf10_4e6e_ae12_1b93ab475796.slice/crio-f068a5a7e508b3d0e17938f739ceb1ceee9ccd30f585d3b0955f06363b4c2f46 WatchSource:0}: Error finding container f068a5a7e508b3d0e17938f739ceb1ceee9ccd30f585d3b0955f06363b4c2f46: Status 404 returned error can't find the container with id f068a5a7e508b3d0e17938f739ceb1ceee9ccd30f585d3b0955f06363b4c2f46 Oct 04 05:05:24 crc kubenswrapper[4574]: I1004 05:05:24.747407 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" path="/var/lib/kubelet/pods/7a2e55c6-3f30-4c4d-a8dd-6952b97e4310/volumes" Oct 04 05:05:25 crc kubenswrapper[4574]: I1004 05:05:25.415413 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9be13b58-cf10-4e6e-ae12-1b93ab475796","Type":"ContainerStarted","Data":"f068a5a7e508b3d0e17938f739ceb1ceee9ccd30f585d3b0955f06363b4c2f46"} Oct 04 05:05:25 crc kubenswrapper[4574]: I1004 
05:05:25.417898 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c","Type":"ContainerStarted","Data":"9ac5d2d728f73e7b024f118d0343f5c3061b006a2d3b7c55965dfdb5aa1631a7"} Oct 04 05:05:25 crc kubenswrapper[4574]: I1004 05:05:25.423816 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" event={"ID":"e598d7f1-865d-47bf-9263-ca027b3c92c9","Type":"ContainerStarted","Data":"73c876e0484d654a09157e18cac7986798e6b0b7d8859389c8ddb9f3bcb29186"} Oct 04 05:05:25 crc kubenswrapper[4574]: I1004 05:05:25.424889 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:25 crc kubenswrapper[4574]: I1004 05:05:25.447137 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" podStartSLOduration=4.447119542 podStartE2EDuration="4.447119542s" podCreationTimestamp="2025-10-04 05:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:25.441962715 +0000 UTC m=+1151.296105767" watchObservedRunningTime="2025-10-04 05:05:25.447119542 +0000 UTC m=+1151.301262584" Oct 04 05:05:25 crc kubenswrapper[4574]: I1004 05:05:25.954507 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:26 crc kubenswrapper[4574]: I1004 05:05:26.076874 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:28 crc kubenswrapper[4574]: I1004 05:05:28.464817 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9be13b58-cf10-4e6e-ae12-1b93ab475796","Type":"ContainerStarted","Data":"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae"} Oct 04 05:05:28 crc 
kubenswrapper[4574]: I1004 05:05:28.472513 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c","Type":"ContainerStarted","Data":"732562073d51162f1ddb02b47ca02a27ebfda0eab4772fc38f7c564ba28e93ee"} Oct 04 05:05:32 crc kubenswrapper[4574]: I1004 05:05:32.245872 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:05:32 crc kubenswrapper[4574]: I1004 05:05:32.334808 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-wfxkf"] Oct 04 05:05:32 crc kubenswrapper[4574]: I1004 05:05:32.335090 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" podUID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerName="dnsmasq-dns" containerID="cri-o://9996ddb2a7c43991e68612cd13d2b5b7095f1eb04d31dbac97dfb32af1bee999" gracePeriod=10 Oct 04 05:05:33 crc kubenswrapper[4574]: I1004 05:05:33.532439 4574 generic.go:334] "Generic (PLEG): container finished" podID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerID="9996ddb2a7c43991e68612cd13d2b5b7095f1eb04d31dbac97dfb32af1bee999" exitCode=0 Oct 04 05:05:33 crc kubenswrapper[4574]: I1004 05:05:33.532793 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" event={"ID":"592f8aa2-58a2-4c6d-b0d5-25c688ccf382","Type":"ContainerDied","Data":"9996ddb2a7c43991e68612cd13d2b5b7095f1eb04d31dbac97dfb32af1bee999"} Oct 04 05:05:34 crc kubenswrapper[4574]: I1004 05:05:34.928992 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:34 crc kubenswrapper[4574]: I1004 05:05:34.946115 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5c85977bcb-np6n7" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.571161 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" event={"ID":"592f8aa2-58a2-4c6d-b0d5-25c688ccf382","Type":"ContainerDied","Data":"74d6959e80b98a1aa80d9d23aa27691f3bacd49e86a39f83b6cf25f86677ff91"} Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.571480 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d6959e80b98a1aa80d9d23aa27691f3bacd49e86a39f83b6cf25f86677ff91" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.656361 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.723834 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-svc\") pod \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.723929 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxnd6\" (UniqueName: \"kubernetes.io/projected/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-kube-api-access-kxnd6\") pod \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.723996 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-nb\") pod \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.724080 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-config\") pod 
\"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.724151 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-sb\") pod \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.724187 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-swift-storage-0\") pod \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\" (UID: \"592f8aa2-58a2-4c6d-b0d5-25c688ccf382\") " Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.768309 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-kube-api-access-kxnd6" (OuterVolumeSpecName: "kube-api-access-kxnd6") pod "592f8aa2-58a2-4c6d-b0d5-25c688ccf382" (UID: "592f8aa2-58a2-4c6d-b0d5-25c688ccf382"). InnerVolumeSpecName "kube-api-access-kxnd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.814301 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-config" (OuterVolumeSpecName: "config") pod "592f8aa2-58a2-4c6d-b0d5-25c688ccf382" (UID: "592f8aa2-58a2-4c6d-b0d5-25c688ccf382"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.830112 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxnd6\" (UniqueName: \"kubernetes.io/projected/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-kube-api-access-kxnd6\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.830145 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.839015 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "592f8aa2-58a2-4c6d-b0d5-25c688ccf382" (UID: "592f8aa2-58a2-4c6d-b0d5-25c688ccf382"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.840599 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "592f8aa2-58a2-4c6d-b0d5-25c688ccf382" (UID: "592f8aa2-58a2-4c6d-b0d5-25c688ccf382"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.844374 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "592f8aa2-58a2-4c6d-b0d5-25c688ccf382" (UID: "592f8aa2-58a2-4c6d-b0d5-25c688ccf382"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.881155 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "592f8aa2-58a2-4c6d-b0d5-25c688ccf382" (UID: "592f8aa2-58a2-4c6d-b0d5-25c688ccf382"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.932173 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.932220 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.932280 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:35 crc kubenswrapper[4574]: I1004 05:05:35.932294 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/592f8aa2-58a2-4c6d-b0d5-25c688ccf382-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:36 crc kubenswrapper[4574]: E1004 05:05:36.198589 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.581681 
4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9be13b58-cf10-4e6e-ae12-1b93ab475796","Type":"ContainerStarted","Data":"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d"} Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.582163 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-log" containerID="cri-o://9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae" gracePeriod=30 Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.582159 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-httpd" containerID="cri-o://086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d" gracePeriod=30 Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.586066 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerStarted","Data":"27d1aa0b8d5de8019698b7ca9a5f14008aad06bbc72fb5f29883b3b2c887886a"} Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.586271 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="ceilometer-notification-agent" containerID="cri-o://75815cc997bbe7a9c11dd0a85638a1fb8145706b94cadf0a9ee36554be1f1b29" gracePeriod=30 Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.586372 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.586426 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="proxy-httpd" containerID="cri-o://27d1aa0b8d5de8019698b7ca9a5f14008aad06bbc72fb5f29883b3b2c887886a" gracePeriod=30 Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.586488 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="sg-core" containerID="cri-o://6bd0bbfe123436007d6ff33e2ea6122d521f5fc915f66e78b8f202915a0e0e32" gracePeriod=30 Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.592909 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c","Type":"ContainerStarted","Data":"c7892f7a5ba1551686ab73801accacf1ef1c272334f413fec1700936352e3459"} Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.593104 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-log" containerID="cri-o://732562073d51162f1ddb02b47ca02a27ebfda0eab4772fc38f7c564ba28e93ee" gracePeriod=30 Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.593272 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-httpd" containerID="cri-o://c7892f7a5ba1551686ab73801accacf1ef1c272334f413fec1700936352e3459" gracePeriod=30 Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.598380 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-wfxkf" Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.598901 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pms9r" event={"ID":"097cde22-53c8-44ef-90c9-7e7dd7c43609","Type":"ContainerStarted","Data":"f29c105623a99459d5fc0228b756b82534fc6a9c427e0d68f6ba478858287c12"} Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.611023 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.611001853 podStartE2EDuration="14.611001853s" podCreationTimestamp="2025-10-04 05:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:36.605357013 +0000 UTC m=+1162.459500055" watchObservedRunningTime="2025-10-04 05:05:36.611001853 +0000 UTC m=+1162.465144905" Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.642139 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.642108549 podStartE2EDuration="15.642108549s" podCreationTimestamp="2025-10-04 05:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:36.641188073 +0000 UTC m=+1162.495331225" watchObservedRunningTime="2025-10-04 05:05:36.642108549 +0000 UTC m=+1162.496251591" Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.715614 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pms9r" podStartSLOduration=1.817400261 podStartE2EDuration="49.715590541s" podCreationTimestamp="2025-10-04 05:04:47 +0000 UTC" firstStartedPulling="2025-10-04 05:04:48.023978893 +0000 UTC m=+1113.878121945" lastFinishedPulling="2025-10-04 05:05:35.922169183 +0000 UTC m=+1161.776312225" 
observedRunningTime="2025-10-04 05:05:36.708847089 +0000 UTC m=+1162.562990141" watchObservedRunningTime="2025-10-04 05:05:36.715590541 +0000 UTC m=+1162.569733583" Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.815074 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-78786b8bfb-qgltl" Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.830713 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-wfxkf"] Oct 04 05:05:36 crc kubenswrapper[4574]: I1004 05:05:36.839349 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-wfxkf"] Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.310131 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.458322 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-config-data\") pod \"9be13b58-cf10-4e6e-ae12-1b93ab475796\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.458636 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-scripts\") pod \"9be13b58-cf10-4e6e-ae12-1b93ab475796\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.458680 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-logs\") pod \"9be13b58-cf10-4e6e-ae12-1b93ab475796\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.458707 4574 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"9be13b58-cf10-4e6e-ae12-1b93ab475796\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.458723 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-combined-ca-bundle\") pod \"9be13b58-cf10-4e6e-ae12-1b93ab475796\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.458761 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h2sk\" (UniqueName: \"kubernetes.io/projected/9be13b58-cf10-4e6e-ae12-1b93ab475796-kube-api-access-2h2sk\") pod \"9be13b58-cf10-4e6e-ae12-1b93ab475796\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.458835 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-httpd-run\") pod \"9be13b58-cf10-4e6e-ae12-1b93ab475796\" (UID: \"9be13b58-cf10-4e6e-ae12-1b93ab475796\") " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.459579 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9be13b58-cf10-4e6e-ae12-1b93ab475796" (UID: "9be13b58-cf10-4e6e-ae12-1b93ab475796"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.460307 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-logs" (OuterVolumeSpecName: "logs") pod "9be13b58-cf10-4e6e-ae12-1b93ab475796" (UID: "9be13b58-cf10-4e6e-ae12-1b93ab475796"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.465663 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "9be13b58-cf10-4e6e-ae12-1b93ab475796" (UID: "9be13b58-cf10-4e6e-ae12-1b93ab475796"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.468552 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-scripts" (OuterVolumeSpecName: "scripts") pod "9be13b58-cf10-4e6e-ae12-1b93ab475796" (UID: "9be13b58-cf10-4e6e-ae12-1b93ab475796"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.469510 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be13b58-cf10-4e6e-ae12-1b93ab475796-kube-api-access-2h2sk" (OuterVolumeSpecName: "kube-api-access-2h2sk") pod "9be13b58-cf10-4e6e-ae12-1b93ab475796" (UID: "9be13b58-cf10-4e6e-ae12-1b93ab475796"). InnerVolumeSpecName "kube-api-access-2h2sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.519527 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9be13b58-cf10-4e6e-ae12-1b93ab475796" (UID: "9be13b58-cf10-4e6e-ae12-1b93ab475796"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.545729 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-config-data" (OuterVolumeSpecName: "config-data") pod "9be13b58-cf10-4e6e-ae12-1b93ab475796" (UID: "9be13b58-cf10-4e6e-ae12-1b93ab475796"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.561421 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.561460 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.561472 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.561506 4574 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 
05:05:37.561522 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be13b58-cf10-4e6e-ae12-1b93ab475796-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.561534 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h2sk\" (UniqueName: \"kubernetes.io/projected/9be13b58-cf10-4e6e-ae12-1b93ab475796-kube-api-access-2h2sk\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.561545 4574 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9be13b58-cf10-4e6e-ae12-1b93ab475796-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.588710 4574 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.612733 4574 generic.go:334] "Generic (PLEG): container finished" podID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerID="c7892f7a5ba1551686ab73801accacf1ef1c272334f413fec1700936352e3459" exitCode=143 Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.612767 4574 generic.go:334] "Generic (PLEG): container finished" podID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerID="732562073d51162f1ddb02b47ca02a27ebfda0eab4772fc38f7c564ba28e93ee" exitCode=143 Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.612806 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c","Type":"ContainerDied","Data":"c7892f7a5ba1551686ab73801accacf1ef1c272334f413fec1700936352e3459"} Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.612848 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c","Type":"ContainerDied","Data":"732562073d51162f1ddb02b47ca02a27ebfda0eab4772fc38f7c564ba28e93ee"} Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.614893 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-khrbr" event={"ID":"9bd3ebd3-498c-4070-9de7-eab9d2866108","Type":"ContainerStarted","Data":"e00a043640637a84e7524626f1b3dbf01348164f357bc15c5c2ccde54fb3dac2"} Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.617304 4574 generic.go:334] "Generic (PLEG): container finished" podID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerID="086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d" exitCode=143 Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.617334 4574 generic.go:334] "Generic (PLEG): container finished" podID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerID="9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae" exitCode=143 Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.617354 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.617398 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9be13b58-cf10-4e6e-ae12-1b93ab475796","Type":"ContainerDied","Data":"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d"} Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.617449 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9be13b58-cf10-4e6e-ae12-1b93ab475796","Type":"ContainerDied","Data":"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae"} Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.617464 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9be13b58-cf10-4e6e-ae12-1b93ab475796","Type":"ContainerDied","Data":"f068a5a7e508b3d0e17938f739ceb1ceee9ccd30f585d3b0955f06363b4c2f46"} Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.617481 4574 scope.go:117] "RemoveContainer" containerID="086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.619806 4574 generic.go:334] "Generic (PLEG): container finished" podID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerID="27d1aa0b8d5de8019698b7ca9a5f14008aad06bbc72fb5f29883b3b2c887886a" exitCode=0 Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.619840 4574 generic.go:334] "Generic (PLEG): container finished" podID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerID="6bd0bbfe123436007d6ff33e2ea6122d521f5fc915f66e78b8f202915a0e0e32" exitCode=2 Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.619861 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerDied","Data":"27d1aa0b8d5de8019698b7ca9a5f14008aad06bbc72fb5f29883b3b2c887886a"} 
Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.619885 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerDied","Data":"6bd0bbfe123436007d6ff33e2ea6122d521f5fc915f66e78b8f202915a0e0e32"} Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.653244 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-khrbr" podStartSLOduration=3.005937358 podStartE2EDuration="51.653210634s" podCreationTimestamp="2025-10-04 05:04:46 +0000 UTC" firstStartedPulling="2025-10-04 05:04:47.84543203 +0000 UTC m=+1113.699575072" lastFinishedPulling="2025-10-04 05:05:36.492705306 +0000 UTC m=+1162.346848348" observedRunningTime="2025-10-04 05:05:37.637102485 +0000 UTC m=+1163.491245527" watchObservedRunningTime="2025-10-04 05:05:37.653210634 +0000 UTC m=+1163.507353676" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.663387 4574 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.663716 4574 scope.go:117] "RemoveContainer" containerID="9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.664483 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.671689 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.691588 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:37 crc kubenswrapper[4574]: E1004 05:05:37.691976 4574 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-httpd" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.691997 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-httpd" Oct 04 05:05:37 crc kubenswrapper[4574]: E1004 05:05:37.692014 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" containerName="init" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.692021 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" containerName="init" Oct 04 05:05:37 crc kubenswrapper[4574]: E1004 05:05:37.692037 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerName="dnsmasq-dns" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.692047 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerName="dnsmasq-dns" Oct 04 05:05:37 crc kubenswrapper[4574]: E1004 05:05:37.692082 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerName="init" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.692090 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerName="init" Oct 04 05:05:37 crc kubenswrapper[4574]: E1004 05:05:37.692101 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-log" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.692108 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-log" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.692325 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2e55c6-3f30-4c4d-a8dd-6952b97e4310" containerName="init" Oct 04 05:05:37 
crc kubenswrapper[4574]: I1004 05:05:37.692344 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-httpd" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.692360 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" containerName="dnsmasq-dns" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.692373 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" containerName="glance-log" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.693275 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.702717 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.702752 4574 scope.go:117] "RemoveContainer" containerID="086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.702907 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 04 05:05:37 crc kubenswrapper[4574]: E1004 05:05:37.703174 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d\": container with ID starting with 086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d not found: ID does not exist" containerID="086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.703211 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d"} 
err="failed to get container status \"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d\": rpc error: code = NotFound desc = could not find container \"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d\": container with ID starting with 086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d not found: ID does not exist" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.703273 4574 scope.go:117] "RemoveContainer" containerID="9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae" Oct 04 05:05:37 crc kubenswrapper[4574]: E1004 05:05:37.703663 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae\": container with ID starting with 9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae not found: ID does not exist" containerID="9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.703699 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae"} err="failed to get container status \"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae\": rpc error: code = NotFound desc = could not find container \"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae\": container with ID starting with 9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae not found: ID does not exist" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.703726 4574 scope.go:117] "RemoveContainer" containerID="086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.705521 4574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d"} err="failed to get container status \"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d\": rpc error: code = NotFound desc = could not find container \"086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d\": container with ID starting with 086969974f989df7a7b1f55b1ba97602e70e745e71640bca4873ee42cfb5302d not found: ID does not exist" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.705550 4574 scope.go:117] "RemoveContainer" containerID="9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.708805 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae"} err="failed to get container status \"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae\": rpc error: code = NotFound desc = could not find container \"9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae\": container with ID starting with 9f0a54dbeae69a4c039c7512a5258ca649f734581830d495568c2561701308ae not found: ID does not exist" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.716070 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.765005 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.765589 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.765784 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-logs\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.765935 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.766133 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.766259 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.766361 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z9h7s\" (UniqueName: \"kubernetes.io/projected/af57198a-6432-454b-ab0f-6e07c76f166b-kube-api-access-z9h7s\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.766477 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.877673 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-logs\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.877904 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878099 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878214 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878318 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9h7s\" (UniqueName: \"kubernetes.io/projected/af57198a-6432-454b-ab0f-6e07c76f166b-kube-api-access-z9h7s\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878427 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878558 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878652 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878737 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-internal-tls-certs\") 
pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878342 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-logs\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.878914 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.884085 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.886444 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.895801 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " 
pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.901287 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.905857 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9h7s\" (UniqueName: \"kubernetes.io/projected/af57198a-6432-454b-ab0f-6e07c76f166b-kube-api-access-z9h7s\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:37 crc kubenswrapper[4574]: I1004 05:05:37.939875 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.021851 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.358767 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.503113 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-combined-ca-bundle\") pod \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.503489 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-config-data\") pod \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.503516 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-logs\") pod \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.503591 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nd94\" (UniqueName: \"kubernetes.io/projected/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-kube-api-access-7nd94\") pod \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.503649 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-scripts\") pod \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.503688 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-httpd-run\") pod \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.503745 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\" (UID: \"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c\") " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.504783 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" (UID: "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.505029 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-logs" (OuterVolumeSpecName: "logs") pod "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" (UID: "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.510686 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-scripts" (OuterVolumeSpecName: "scripts") pod "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" (UID: "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.518514 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" (UID: "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.542892 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-kube-api-access-7nd94" (OuterVolumeSpecName: "kube-api-access-7nd94") pod "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" (UID: "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c"). InnerVolumeSpecName "kube-api-access-7nd94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.568276 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" (UID: "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.605742 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nd94\" (UniqueName: \"kubernetes.io/projected/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-kube-api-access-7nd94\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.605772 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.605798 4574 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.605826 4574 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.605835 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.605844 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.613523 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-config-data" (OuterVolumeSpecName: "config-data") pod "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" (UID: "95d8cf8d-7a8a-4987-8074-9fdb60b5d95c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.636801 4574 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.645502 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.645387 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95d8cf8d-7a8a-4987-8074-9fdb60b5d95c","Type":"ContainerDied","Data":"9ac5d2d728f73e7b024f118d0343f5c3061b006a2d3b7c55965dfdb5aa1631a7"} Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.647173 4574 scope.go:117] "RemoveContainer" containerID="c7892f7a5ba1551686ab73801accacf1ef1c272334f413fec1700936352e3459" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.707337 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.707372 4574 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.735072 4574 scope.go:117] "RemoveContainer" containerID="732562073d51162f1ddb02b47ca02a27ebfda0eab4772fc38f7c564ba28e93ee" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.752323 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="592f8aa2-58a2-4c6d-b0d5-25c688ccf382" path="/var/lib/kubelet/pods/592f8aa2-58a2-4c6d-b0d5-25c688ccf382/volumes" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.753281 4574 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="9be13b58-cf10-4e6e-ae12-1b93ab475796" path="/var/lib/kubelet/pods/9be13b58-cf10-4e6e-ae12-1b93ab475796/volumes" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.753928 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.753963 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.782464 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:38 crc kubenswrapper[4574]: E1004 05:05:38.782843 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-log" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.782860 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-log" Oct 04 05:05:38 crc kubenswrapper[4574]: E1004 05:05:38.782884 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-httpd" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.782890 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-httpd" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.783099 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-log" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.783116 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" containerName="glance-httpd" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.784149 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.788945 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.789204 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.840696 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.857470 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.911695 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-logs\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.911767 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.911892 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc 
kubenswrapper[4574]: I1004 05:05:38.911916 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.911946 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.911999 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.912174 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:38 crc kubenswrapper[4574]: I1004 05:05:38.912396 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4ckb\" (UniqueName: \"kubernetes.io/projected/a8ebe95c-3128-46bd-8529-b87b860a6098-kube-api-access-n4ckb\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 
crc kubenswrapper[4574]: I1004 05:05:39.014046 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4ckb\" (UniqueName: \"kubernetes.io/projected/a8ebe95c-3128-46bd-8529-b87b860a6098-kube-api-access-n4ckb\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.014435 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-logs\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.014471 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.014556 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.014582 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.014616 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.014656 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.014699 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.023608 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-logs\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.023860 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.023866 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.023995 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.036963 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.037017 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.037596 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.076650 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.108399 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4ckb\" (UniqueName: \"kubernetes.io/projected/a8ebe95c-3128-46bd-8529-b87b860a6098-kube-api-access-n4ckb\") pod \"glance-default-external-api-0\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.402737 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.446156 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.447321 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.450855 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.451153 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.451318 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-v2szr" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.464327 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.527184 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdrh\" (UniqueName: \"kubernetes.io/projected/bde1eb16-1a52-4176-8143-e4f13507cb3b-kube-api-access-npdrh\") pod 
\"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.527257 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.527312 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config-secret\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.527328 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.632947 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdrh\" (UniqueName: \"kubernetes.io/projected/bde1eb16-1a52-4176-8143-e4f13507cb3b-kube-api-access-npdrh\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.638071 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " 
pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.638248 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config-secret\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.638276 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.639401 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.652961 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.653558 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config-secret\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.659820 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-npdrh\" (UniqueName: \"kubernetes.io/projected/bde1eb16-1a52-4176-8143-e4f13507cb3b-kube-api-access-npdrh\") pod \"openstackclient\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.674429 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af57198a-6432-454b-ab0f-6e07c76f166b","Type":"ContainerStarted","Data":"c9fcb5e4507a34e5517e20b17393a213380c47fb5e131c23b97658005fd612e0"} Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.775575 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.776335 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.807289 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.813347 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.814451 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.822996 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.946581 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmbs6\" (UniqueName: \"kubernetes.io/projected/2552db74-0d8b-4ca0-af2e-092c03e097f2-kube-api-access-rmbs6\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.946849 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552db74-0d8b-4ca0-af2e-092c03e097f2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.946983 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2552db74-0d8b-4ca0-af2e-092c03e097f2-openstack-config-secret\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:39 crc kubenswrapper[4574]: I1004 05:05:39.947094 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2552db74-0d8b-4ca0-af2e-092c03e097f2-openstack-config\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: E1004 05:05:40.016557 4574 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 04 05:05:40 crc kubenswrapper[4574]: rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_openstackclient_openstack_bde1eb16-1a52-4176-8143-e4f13507cb3b_0(fdd1e9289745750c81d33f4d5a884c54f5672db0f22a9157dfc10c801629beca): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fdd1e9289745750c81d33f4d5a884c54f5672db0f22a9157dfc10c801629beca" Netns:"/var/run/netns/99fc0be7-92c8-47e7-b80c-babbdd7529d4" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=fdd1e9289745750c81d33f4d5a884c54f5672db0f22a9157dfc10c801629beca;K8S_POD_UID=bde1eb16-1a52-4176-8143-e4f13507cb3b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/bde1eb16-1a52-4176-8143-e4f13507cb3b]: expected pod UID "bde1eb16-1a52-4176-8143-e4f13507cb3b" but got "2552db74-0d8b-4ca0-af2e-092c03e097f2" from Kube API Oct 04 05:05:40 crc kubenswrapper[4574]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 04 05:05:40 crc kubenswrapper[4574]: > Oct 04 05:05:40 crc kubenswrapper[4574]: E1004 05:05:40.022371 4574 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Oct 04 05:05:40 crc kubenswrapper[4574]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_bde1eb16-1a52-4176-8143-e4f13507cb3b_0(fdd1e9289745750c81d33f4d5a884c54f5672db0f22a9157dfc10c801629beca): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:"fdd1e9289745750c81d33f4d5a884c54f5672db0f22a9157dfc10c801629beca" Netns:"/var/run/netns/99fc0be7-92c8-47e7-b80c-babbdd7529d4" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=fdd1e9289745750c81d33f4d5a884c54f5672db0f22a9157dfc10c801629beca;K8S_POD_UID=bde1eb16-1a52-4176-8143-e4f13507cb3b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/bde1eb16-1a52-4176-8143-e4f13507cb3b]: expected pod UID "bde1eb16-1a52-4176-8143-e4f13507cb3b" but got "2552db74-0d8b-4ca0-af2e-092c03e097f2" from Kube API Oct 04 05:05:40 crc kubenswrapper[4574]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 04 05:05:40 crc kubenswrapper[4574]: > pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.048804 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbs6\" (UniqueName: \"kubernetes.io/projected/2552db74-0d8b-4ca0-af2e-092c03e097f2-kube-api-access-rmbs6\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.048867 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552db74-0d8b-4ca0-af2e-092c03e097f2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.048894 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2552db74-0d8b-4ca0-af2e-092c03e097f2-openstack-config-secret\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.048929 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2552db74-0d8b-4ca0-af2e-092c03e097f2-openstack-config\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.049684 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2552db74-0d8b-4ca0-af2e-092c03e097f2-openstack-config\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.055784 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2552db74-0d8b-4ca0-af2e-092c03e097f2-openstack-config-secret\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.065472 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2552db74-0d8b-4ca0-af2e-092c03e097f2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.081920 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbs6\" (UniqueName: 
\"kubernetes.io/projected/2552db74-0d8b-4ca0-af2e-092c03e097f2-kube-api-access-rmbs6\") pod \"openstackclient\" (UID: \"2552db74-0d8b-4ca0-af2e-092c03e097f2\") " pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.231602 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.272334 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: W1004 05:05:40.274566 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ebe95c_3128_46bd_8529_b87b860a6098.slice/crio-d20831b8584e14c5edeb86245bae8046b56d0720bbe4e9ecd21ba5932ce485d7 WatchSource:0}: Error finding container d20831b8584e14c5edeb86245bae8046b56d0720bbe4e9ecd21ba5932ce485d7: Status 404 returned error can't find the container with id d20831b8584e14c5edeb86245bae8046b56d0720bbe4e9ecd21ba5932ce485d7 Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.864310 4574 generic.go:334] "Generic (PLEG): container finished" podID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerID="75815cc997bbe7a9c11dd0a85638a1fb8145706b94cadf0a9ee36554be1f1b29" exitCode=0 Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.890219 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d8cf8d-7a8a-4987-8074-9fdb60b5d95c" path="/var/lib/kubelet/pods/95d8cf8d-7a8a-4987-8074-9fdb60b5d95c/volumes" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.891566 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerDied","Data":"75815cc997bbe7a9c11dd0a85638a1fb8145706b94cadf0a9ee36554be1f1b29"} Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.891596 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"a8ebe95c-3128-46bd-8529-b87b860a6098","Type":"ContainerStarted","Data":"d20831b8584e14c5edeb86245bae8046b56d0720bbe4e9ecd21ba5932ce485d7"} Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.912165 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.912633 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af57198a-6432-454b-ab0f-6e07c76f166b","Type":"ContainerStarted","Data":"aa20b5936822cce937ad9574e370a9a1cc746c8eed61f5e21dfe9f81965eeb99"} Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.923555 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.924182 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.963666 4574 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bde1eb16-1a52-4176-8143-e4f13507cb3b" podUID="2552db74-0d8b-4ca0-af2e-092c03e097f2" Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990418 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-run-httpd\") pod \"f7cb1e07-7587-4b93-bf2f-a8229038b290\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990475 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-config-data\") pod \"f7cb1e07-7587-4b93-bf2f-a8229038b290\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " 
Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990498 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-log-httpd\") pod \"f7cb1e07-7587-4b93-bf2f-a8229038b290\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990547 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config-secret\") pod \"bde1eb16-1a52-4176-8143-e4f13507cb3b\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990626 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-combined-ca-bundle\") pod \"f7cb1e07-7587-4b93-bf2f-a8229038b290\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990644 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-sg-core-conf-yaml\") pod \"f7cb1e07-7587-4b93-bf2f-a8229038b290\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990702 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-scripts\") pod \"f7cb1e07-7587-4b93-bf2f-a8229038b290\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990751 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config\") pod \"bde1eb16-1a52-4176-8143-e4f13507cb3b\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990774 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npdrh\" (UniqueName: \"kubernetes.io/projected/bde1eb16-1a52-4176-8143-e4f13507cb3b-kube-api-access-npdrh\") pod \"bde1eb16-1a52-4176-8143-e4f13507cb3b\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990805 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgb6x\" (UniqueName: \"kubernetes.io/projected/f7cb1e07-7587-4b93-bf2f-a8229038b290-kube-api-access-vgb6x\") pod \"f7cb1e07-7587-4b93-bf2f-a8229038b290\" (UID: \"f7cb1e07-7587-4b93-bf2f-a8229038b290\") " Oct 04 05:05:40 crc kubenswrapper[4574]: I1004 05:05:40.990841 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-combined-ca-bundle\") pod \"bde1eb16-1a52-4176-8143-e4f13507cb3b\" (UID: \"bde1eb16-1a52-4176-8143-e4f13507cb3b\") " Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.001595 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bde1eb16-1a52-4176-8143-e4f13507cb3b" (UID: "bde1eb16-1a52-4176-8143-e4f13507cb3b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.013898 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7cb1e07-7587-4b93-bf2f-a8229038b290" (UID: "f7cb1e07-7587-4b93-bf2f-a8229038b290"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.016343 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7cb1e07-7587-4b93-bf2f-a8229038b290" (UID: "f7cb1e07-7587-4b93-bf2f-a8229038b290"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.019525 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bde1eb16-1a52-4176-8143-e4f13507cb3b" (UID: "bde1eb16-1a52-4176-8143-e4f13507cb3b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.022931 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bde1eb16-1a52-4176-8143-e4f13507cb3b" (UID: "bde1eb16-1a52-4176-8143-e4f13507cb3b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.023687 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.041105 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-scripts" (OuterVolumeSpecName: "scripts") pod "f7cb1e07-7587-4b93-bf2f-a8229038b290" (UID: "f7cb1e07-7587-4b93-bf2f-a8229038b290"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.041962 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde1eb16-1a52-4176-8143-e4f13507cb3b-kube-api-access-npdrh" (OuterVolumeSpecName: "kube-api-access-npdrh") pod "bde1eb16-1a52-4176-8143-e4f13507cb3b" (UID: "bde1eb16-1a52-4176-8143-e4f13507cb3b"). InnerVolumeSpecName "kube-api-access-npdrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.042112 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7cb1e07-7587-4b93-bf2f-a8229038b290" (UID: "f7cb1e07-7587-4b93-bf2f-a8229038b290"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.055740 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cb1e07-7587-4b93-bf2f-a8229038b290-kube-api-access-vgb6x" (OuterVolumeSpecName: "kube-api-access-vgb6x") pod "f7cb1e07-7587-4b93-bf2f-a8229038b290" (UID: "f7cb1e07-7587-4b93-bf2f-a8229038b290"). InnerVolumeSpecName "kube-api-access-vgb6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095600 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095846 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095857 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095866 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npdrh\" (UniqueName: \"kubernetes.io/projected/bde1eb16-1a52-4176-8143-e4f13507cb3b-kube-api-access-npdrh\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095877 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgb6x\" (UniqueName: \"kubernetes.io/projected/f7cb1e07-7587-4b93-bf2f-a8229038b290-kube-api-access-vgb6x\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095886 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095894 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 
05:05:41.095901 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cb1e07-7587-4b93-bf2f-a8229038b290-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.095911 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bde1eb16-1a52-4176-8143-e4f13507cb3b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.106449 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7cb1e07-7587-4b93-bf2f-a8229038b290" (UID: "f7cb1e07-7587-4b93-bf2f-a8229038b290"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.159638 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-config-data" (OuterVolumeSpecName: "config-data") pod "f7cb1e07-7587-4b93-bf2f-a8229038b290" (UID: "f7cb1e07-7587-4b93-bf2f-a8229038b290"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.197777 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.197801 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cb1e07-7587-4b93-bf2f-a8229038b290-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.924248 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cb1e07-7587-4b93-bf2f-a8229038b290","Type":"ContainerDied","Data":"365ba33ba08a7f9792e3356815ca11b94f454e6062fd4f1704e9fe0502b6f616"} Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.924577 4574 scope.go:117] "RemoveContainer" containerID="27d1aa0b8d5de8019698b7ca9a5f14008aad06bbc72fb5f29883b3b2c887886a" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.924299 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.935389 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8ebe95c-3128-46bd-8529-b87b860a6098","Type":"ContainerStarted","Data":"c65a0665b928d1cf8419e151c5471d2eda1ec225554cc3eda9929279ea550887"} Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.939835 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af57198a-6432-454b-ab0f-6e07c76f166b","Type":"ContainerStarted","Data":"37ccbd2726c9a4b46c60ff716d803cf8f50d875d530c80d97e42832cc6d84a23"} Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.946565 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.946623 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2552db74-0d8b-4ca0-af2e-092c03e097f2","Type":"ContainerStarted","Data":"7ac41c3d64f77ce20c349a38a0b6e39308292ba6563dd6023f7fc86a58730576"} Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.967064 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.967043581 podStartE2EDuration="4.967043581s" podCreationTimestamp="2025-10-04 05:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:41.960219067 +0000 UTC m=+1167.814362109" watchObservedRunningTime="2025-10-04 05:05:41.967043581 +0000 UTC m=+1167.821186623" Oct 04 05:05:41 crc kubenswrapper[4574]: I1004 05:05:41.980569 4574 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bde1eb16-1a52-4176-8143-e4f13507cb3b" podUID="2552db74-0d8b-4ca0-af2e-092c03e097f2" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.001543 4574 scope.go:117] "RemoveContainer" containerID="6bd0bbfe123436007d6ff33e2ea6122d521f5fc915f66e78b8f202915a0e0e32" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.054534 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.075879 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.081876 4574 scope.go:117] "RemoveContainer" containerID="75815cc997bbe7a9c11dd0a85638a1fb8145706b94cadf0a9ee36554be1f1b29" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.099314 4574 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Oct 04 05:05:42 crc kubenswrapper[4574]: E1004 05:05:42.101520 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="proxy-httpd" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.101547 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="proxy-httpd" Oct 04 05:05:42 crc kubenswrapper[4574]: E1004 05:05:42.101573 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="ceilometer-notification-agent" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.101581 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="ceilometer-notification-agent" Oct 04 05:05:42 crc kubenswrapper[4574]: E1004 05:05:42.101619 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="sg-core" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.101629 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="sg-core" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.101828 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="ceilometer-notification-agent" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.101845 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="sg-core" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.101863 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" containerName="proxy-httpd" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.104887 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.110935 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.111156 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.125918 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.243669 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpdfc\" (UniqueName: \"kubernetes.io/projected/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-kube-api-access-zpdfc\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.244023 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.244108 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-config-data\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.244191 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.244207 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-log-httpd\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.244435 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-scripts\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.244532 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.347858 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdfc\" (UniqueName: \"kubernetes.io/projected/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-kube-api-access-zpdfc\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.348170 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.348194 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-config-data\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.348225 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-run-httpd\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.348307 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-log-httpd\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.348337 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-scripts\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.348372 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.349069 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-run-httpd\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " 
pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.349627 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-log-httpd\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.354524 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-config-data\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.362886 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-scripts\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.382782 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.384017 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpdfc\" (UniqueName: \"kubernetes.io/projected/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-kube-api-access-zpdfc\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.402731 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.449888 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.747074 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde1eb16-1a52-4176-8143-e4f13507cb3b" path="/var/lib/kubelet/pods/bde1eb16-1a52-4176-8143-e4f13507cb3b/volumes" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.747878 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7cb1e07-7587-4b93-bf2f-a8229038b290" path="/var/lib/kubelet/pods/f7cb1e07-7587-4b93-bf2f-a8229038b290/volumes" Oct 04 05:05:42 crc kubenswrapper[4574]: I1004 05:05:42.993696 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8ebe95c-3128-46bd-8529-b87b860a6098","Type":"ContainerStarted","Data":"b8aec1b4b90597eb94fb6acd08dfb480d7db36d403006e7977e040ab6158f37a"} Oct 04 05:05:43 crc kubenswrapper[4574]: I1004 05:05:43.027727 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.027710977 podStartE2EDuration="5.027710977s" podCreationTimestamp="2025-10-04 05:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:43.016600081 +0000 UTC m=+1168.870743113" watchObservedRunningTime="2025-10-04 05:05:43.027710977 +0000 UTC m=+1168.881854019" Oct 04 05:05:43 crc kubenswrapper[4574]: I1004 05:05:43.056502 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:05:43 crc kubenswrapper[4574]: E1004 05:05:43.129275 4574 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod097cde22_53c8_44ef_90c9_7e7dd7c43609.slice/crio-f29c105623a99459d5fc0228b756b82534fc6a9c427e0d68f6ba478858287c12.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod097cde22_53c8_44ef_90c9_7e7dd7c43609.slice/crio-conmon-f29c105623a99459d5fc0228b756b82534fc6a9c427e0d68f6ba478858287c12.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:05:44 crc kubenswrapper[4574]: I1004 05:05:44.016553 4574 generic.go:334] "Generic (PLEG): container finished" podID="097cde22-53c8-44ef-90c9-7e7dd7c43609" containerID="f29c105623a99459d5fc0228b756b82534fc6a9c427e0d68f6ba478858287c12" exitCode=0 Oct 04 05:05:44 crc kubenswrapper[4574]: I1004 05:05:44.017246 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pms9r" event={"ID":"097cde22-53c8-44ef-90c9-7e7dd7c43609","Type":"ContainerDied","Data":"f29c105623a99459d5fc0228b756b82534fc6a9c427e0d68f6ba478858287c12"} Oct 04 05:05:44 crc kubenswrapper[4574]: I1004 05:05:44.024462 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerStarted","Data":"77a156f6abff746f8f7118780c377a3d23b535913ac83312482804706bc44b8b"} Oct 04 05:05:44 crc kubenswrapper[4574]: I1004 05:05:44.024526 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerStarted","Data":"ac9df1336282620c4b5d4b21ec53093ac2f06507b975a0ba0a59843eeb4a2ba2"} Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.038078 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerStarted","Data":"8070f6da02baa0889f79b66693906c23308b25c0e6038a5b5ed2aeda3dc08d7d"} Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.424879 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pms9r" Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.527441 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-combined-ca-bundle\") pod \"097cde22-53c8-44ef-90c9-7e7dd7c43609\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.527705 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlsc8\" (UniqueName: \"kubernetes.io/projected/097cde22-53c8-44ef-90c9-7e7dd7c43609-kube-api-access-tlsc8\") pod \"097cde22-53c8-44ef-90c9-7e7dd7c43609\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.528515 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-db-sync-config-data\") pod \"097cde22-53c8-44ef-90c9-7e7dd7c43609\" (UID: \"097cde22-53c8-44ef-90c9-7e7dd7c43609\") " Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.535014 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097cde22-53c8-44ef-90c9-7e7dd7c43609-kube-api-access-tlsc8" (OuterVolumeSpecName: "kube-api-access-tlsc8") pod "097cde22-53c8-44ef-90c9-7e7dd7c43609" (UID: "097cde22-53c8-44ef-90c9-7e7dd7c43609"). InnerVolumeSpecName "kube-api-access-tlsc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.544701 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "097cde22-53c8-44ef-90c9-7e7dd7c43609" (UID: "097cde22-53c8-44ef-90c9-7e7dd7c43609"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.574511 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "097cde22-53c8-44ef-90c9-7e7dd7c43609" (UID: "097cde22-53c8-44ef-90c9-7e7dd7c43609"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.630890 4574 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.630937 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097cde22-53c8-44ef-90c9-7e7dd7c43609-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:45 crc kubenswrapper[4574]: I1004 05:05:45.630951 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlsc8\" (UniqueName: \"kubernetes.io/projected/097cde22-53c8-44ef-90c9-7e7dd7c43609-kube-api-access-tlsc8\") on node \"crc\" DevicePath \"\"" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.054668 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pms9r" 
event={"ID":"097cde22-53c8-44ef-90c9-7e7dd7c43609","Type":"ContainerDied","Data":"7050a9f24d331ea45ce562aa819b77af260a849aa1ac8ec6260cfcb4d3f7db88"} Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.055038 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7050a9f24d331ea45ce562aa819b77af260a849aa1ac8ec6260cfcb4d3f7db88" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.055116 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pms9r" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.073467 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerStarted","Data":"3265cf1123ef8ef67b95a71de9ba52b89d7ed450840d39525d8c32c74c8f0da2"} Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.312676 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-86595bb85-v84cq"] Oct 04 05:05:46 crc kubenswrapper[4574]: E1004 05:05:46.313154 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097cde22-53c8-44ef-90c9-7e7dd7c43609" containerName="barbican-db-sync" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.313170 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="097cde22-53c8-44ef-90c9-7e7dd7c43609" containerName="barbican-db-sync" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.313487 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="097cde22-53c8-44ef-90c9-7e7dd7c43609" containerName="barbican-db-sync" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.316484 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.325927 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.326159 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rmg8w" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.331604 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.348451 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86595bb85-v84cq"] Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.398788 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5475848bb4-qk59c"] Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.404975 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448292 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-config-data\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448336 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-combined-ca-bundle\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448382 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-config-data\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448404 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/857fe45e-27ff-44ef-b58c-9e1278946927-logs\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448429 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-config-data-custom\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448457 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-config-data-custom\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448487 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-combined-ca-bundle\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448525 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppq8c\" (UniqueName: \"kubernetes.io/projected/857fe45e-27ff-44ef-b58c-9e1278946927-kube-api-access-ppq8c\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.448568 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5438cd90-23bc-4da2-8856-519b7656f8ff-logs\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 
05:05:46.448602 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5km4\" (UniqueName: \"kubernetes.io/projected/5438cd90-23bc-4da2-8856-519b7656f8ff-kube-api-access-f5km4\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.454183 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.509484 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5475848bb4-qk59c"] Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.551386 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-fgd5b"] Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.551404 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5km4\" (UniqueName: \"kubernetes.io/projected/5438cd90-23bc-4da2-8856-519b7656f8ff-kube-api-access-f5km4\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.551980 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-config-data\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552028 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-combined-ca-bundle\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552101 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-config-data\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552130 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/857fe45e-27ff-44ef-b58c-9e1278946927-logs\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552158 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-config-data-custom\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552214 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-config-data-custom\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552286 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-combined-ca-bundle\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552352 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppq8c\" (UniqueName: \"kubernetes.io/projected/857fe45e-27ff-44ef-b58c-9e1278946927-kube-api-access-ppq8c\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.552437 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5438cd90-23bc-4da2-8856-519b7656f8ff-logs\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.553153 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.560325 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5438cd90-23bc-4da2-8856-519b7656f8ff-logs\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.571489 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/857fe45e-27ff-44ef-b58c-9e1278946927-logs\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.574248 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-config-data\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.578033 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-config-data-custom\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.583114 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-config-data\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " 
pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.589413 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-fgd5b"] Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.615738 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-combined-ca-bundle\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.616478 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5438cd90-23bc-4da2-8856-519b7656f8ff-combined-ca-bundle\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.616479 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/857fe45e-27ff-44ef-b58c-9e1278946927-config-data-custom\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.618012 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppq8c\" (UniqueName: \"kubernetes.io/projected/857fe45e-27ff-44ef-b58c-9e1278946927-kube-api-access-ppq8c\") pod \"barbican-worker-86595bb85-v84cq\" (UID: \"857fe45e-27ff-44ef-b58c-9e1278946927\") " pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.636925 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-86595bb85-v84cq" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.639096 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5km4\" (UniqueName: \"kubernetes.io/projected/5438cd90-23bc-4da2-8856-519b7656f8ff-kube-api-access-f5km4\") pod \"barbican-keystone-listener-5475848bb4-qk59c\" (UID: \"5438cd90-23bc-4da2-8856-519b7656f8ff\") " pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.657516 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l569k\" (UniqueName: \"kubernetes.io/projected/c95796cc-d004-4e3c-bfcb-38be356dbf92-kube-api-access-l569k\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.657866 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.657989 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-config\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.658066 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.658152 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.658272 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.691037 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-848d9d6b7d-xxqvb"] Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.694024 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.704490 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.713466 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-848d9d6b7d-xxqvb"] Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761559 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlccw\" (UniqueName: \"kubernetes.io/projected/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-kube-api-access-hlccw\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761656 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761719 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data-custom\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761751 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " 
pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761848 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-logs\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761899 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-combined-ca-bundle\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761942 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l569k\" (UniqueName: \"kubernetes.io/projected/c95796cc-d004-4e3c-bfcb-38be356dbf92-kube-api-access-l569k\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.761975 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.762006 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: 
\"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.762038 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-config\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.762059 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.763267 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.765679 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.766076 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " 
pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.766401 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-config\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.770577 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.772470 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.789830 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l569k\" (UniqueName: \"kubernetes.io/projected/c95796cc-d004-4e3c-bfcb-38be356dbf92-kube-api-access-l569k\") pod \"dnsmasq-dns-75c8ddd69c-fgd5b\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.879256 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-combined-ca-bundle\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.879672 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.879772 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlccw\" (UniqueName: \"kubernetes.io/projected/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-kube-api-access-hlccw\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.879919 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data-custom\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.891044 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-logs\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.891530 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-logs\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.898037 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data\") pod 
\"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.898831 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data-custom\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.907912 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-combined-ca-bundle\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:46 crc kubenswrapper[4574]: I1004 05:05:46.925834 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlccw\" (UniqueName: \"kubernetes.io/projected/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-kube-api-access-hlccw\") pod \"barbican-api-848d9d6b7d-xxqvb\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:47 crc kubenswrapper[4574]: I1004 05:05:47.025559 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:05:47 crc kubenswrapper[4574]: I1004 05:05:47.046528 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:47 crc kubenswrapper[4574]: I1004 05:05:47.482896 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5475848bb4-qk59c"] Oct 04 05:05:47 crc kubenswrapper[4574]: W1004 05:05:47.494306 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod857fe45e_27ff_44ef_b58c_9e1278946927.slice/crio-b06adf841588d003f0efd3e4c177dee850f2aabe843865fbf2fb008b68e9bca8 WatchSource:0}: Error finding container b06adf841588d003f0efd3e4c177dee850f2aabe843865fbf2fb008b68e9bca8: Status 404 returned error can't find the container with id b06adf841588d003f0efd3e4c177dee850f2aabe843865fbf2fb008b68e9bca8 Oct 04 05:05:47 crc kubenswrapper[4574]: I1004 05:05:47.502480 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86595bb85-v84cq"] Oct 04 05:05:47 crc kubenswrapper[4574]: I1004 05:05:47.677565 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:05:47 crc kubenswrapper[4574]: I1004 05:05:47.787727 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-fgd5b"] Oct 04 05:05:47 crc kubenswrapper[4574]: W1004 05:05:47.876387 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc95796cc_d004_4e3c_bfcb_38be356dbf92.slice/crio-540dc6b4978d474519d8a9b9c9fd64dab17c8a1074462c86955ad2ec756a4a15 WatchSource:0}: Error finding container 540dc6b4978d474519d8a9b9c9fd64dab17c8a1074462c86955ad2ec756a4a15: Status 404 returned error can't find the container with id 540dc6b4978d474519d8a9b9c9fd64dab17c8a1074462c86955ad2ec756a4a15 Oct 04 05:05:47 crc kubenswrapper[4574]: I1004 05:05:47.921413 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-848d9d6b7d-xxqvb"] Oct 04 05:05:47 crc kubenswrapper[4574]: W1004 05:05:47.953417 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871e935e_c9ec_4798_a014_6a9fe6cd1fbd.slice/crio-fdfb7bc705ef5afb36cc5dcab327f1c790613d249173d8fc5e174a40b687b12b WatchSource:0}: Error finding container fdfb7bc705ef5afb36cc5dcab327f1c790613d249173d8fc5e174a40b687b12b: Status 404 returned error can't find the container with id fdfb7bc705ef5afb36cc5dcab327f1c790613d249173d8fc5e174a40b687b12b Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.022543 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.023629 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.114832 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86595bb85-v84cq" event={"ID":"857fe45e-27ff-44ef-b58c-9e1278946927","Type":"ContainerStarted","Data":"b06adf841588d003f0efd3e4c177dee850f2aabe843865fbf2fb008b68e9bca8"} Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.120424 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-848d9d6b7d-xxqvb" event={"ID":"871e935e-c9ec-4798-a014-6a9fe6cd1fbd","Type":"ContainerStarted","Data":"fdfb7bc705ef5afb36cc5dcab327f1c790613d249173d8fc5e174a40b687b12b"} Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.141625 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerStarted","Data":"25dee4d6be9b32c6e76e1150a59a6eb4f9717d7c9a3353d402fa996fa6c44139"} Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.143404 4574 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.153585 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" event={"ID":"c95796cc-d004-4e3c-bfcb-38be356dbf92","Type":"ContainerStarted","Data":"540dc6b4978d474519d8a9b9c9fd64dab17c8a1074462c86955ad2ec756a4a15"} Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.161106 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.168817 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" event={"ID":"5438cd90-23bc-4da2-8856-519b7656f8ff","Type":"ContainerStarted","Data":"4dd90f2a768e41d70df5b68d4b0a068c4614aa5abc8940a4322b9663dd47888c"} Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.168879 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.233426 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.479303453 podStartE2EDuration="6.233397317s" podCreationTimestamp="2025-10-04 05:05:42 +0000 UTC" firstStartedPulling="2025-10-04 05:05:43.084375211 +0000 UTC m=+1168.938518253" lastFinishedPulling="2025-10-04 05:05:46.838469075 +0000 UTC m=+1172.692612117" observedRunningTime="2025-10-04 05:05:48.209065894 +0000 UTC m=+1174.063208946" watchObservedRunningTime="2025-10-04 05:05:48.233397317 +0000 UTC m=+1174.087540409" Oct 04 05:05:48 crc kubenswrapper[4574]: I1004 05:05:48.497770 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.180192 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-848d9d6b7d-xxqvb" event={"ID":"871e935e-c9ec-4798-a014-6a9fe6cd1fbd","Type":"ContainerStarted","Data":"2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37"} Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.182282 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" event={"ID":"c95796cc-d004-4e3c-bfcb-38be356dbf92","Type":"ContainerStarted","Data":"6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13"} Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.183189 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.404354 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.404395 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.465853 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.465995 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.713366 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7688fc9d67-qlxww"] Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.715133 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.717145 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.717175 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.721333 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.738321 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7688fc9d67-qlxww"] Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.878439 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-public-tls-certs\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.878498 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-config-data\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.878628 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxrz\" (UniqueName: \"kubernetes.io/projected/710de145-ae9a-41bf-9b90-564a1e4acee6-kube-api-access-vpxrz\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc 
kubenswrapper[4574]: I1004 05:05:49.878654 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/710de145-ae9a-41bf-9b90-564a1e4acee6-etc-swift\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.878699 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/710de145-ae9a-41bf-9b90-564a1e4acee6-log-httpd\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.878745 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-combined-ca-bundle\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.878766 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-internal-tls-certs\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.878813 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/710de145-ae9a-41bf-9b90-564a1e4acee6-run-httpd\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" 
Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.985598 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxrz\" (UniqueName: \"kubernetes.io/projected/710de145-ae9a-41bf-9b90-564a1e4acee6-kube-api-access-vpxrz\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.985665 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/710de145-ae9a-41bf-9b90-564a1e4acee6-etc-swift\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.985752 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/710de145-ae9a-41bf-9b90-564a1e4acee6-log-httpd\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.985834 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-combined-ca-bundle\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.985862 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-internal-tls-certs\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 
05:05:49.985918 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/710de145-ae9a-41bf-9b90-564a1e4acee6-run-httpd\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.985948 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-public-tls-certs\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.985977 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-config-data\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.986685 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/710de145-ae9a-41bf-9b90-564a1e4acee6-run-httpd\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.987299 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/710de145-ae9a-41bf-9b90-564a1e4acee6-log-httpd\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:49 crc kubenswrapper[4574]: I1004 05:05:49.993686 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-combined-ca-bundle\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.003206 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-internal-tls-certs\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.010143 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-public-tls-certs\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.019547 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710de145-ae9a-41bf-9b90-564a1e4acee6-config-data\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.020611 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/710de145-ae9a-41bf-9b90-564a1e4acee6-etc-swift\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.023954 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxrz\" (UniqueName: 
\"kubernetes.io/projected/710de145-ae9a-41bf-9b90-564a1e4acee6-kube-api-access-vpxrz\") pod \"swift-proxy-7688fc9d67-qlxww\" (UID: \"710de145-ae9a-41bf-9b90-564a1e4acee6\") " pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.041142 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.210912 4574 generic.go:334] "Generic (PLEG): container finished" podID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerID="3e4dc4fc365b9ba947066873c9c3d152cb971dadf939b36d9d774912264c3816" exitCode=137 Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.211001 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerDied","Data":"3e4dc4fc365b9ba947066873c9c3d152cb971dadf939b36d9d774912264c3816"} Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.218770 4574 generic.go:334] "Generic (PLEG): container finished" podID="85281a42-f9ab-4302-9fe9-4e742075530f" containerID="bafa808bdf2a35dee0e61ef90e7b8b4999e39f07d3cde96c5386527343a5b987" exitCode=137 Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.218883 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bfb4d496-nv6hv" event={"ID":"85281a42-f9ab-4302-9fe9-4e742075530f","Type":"ContainerDied","Data":"bafa808bdf2a35dee0e61ef90e7b8b4999e39f07d3cde96c5386527343a5b987"} Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.223275 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-766d778598-9bz6b"] Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.226101 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.233550 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.234116 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.233579 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-848d9d6b7d-xxqvb" event={"ID":"871e935e-c9ec-4798-a014-6a9fe6cd1fbd","Type":"ContainerStarted","Data":"431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858"} Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.234351 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.234402 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.237931 4574 generic.go:334] "Generic (PLEG): container finished" podID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerID="6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13" exitCode=0 Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.240769 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" event={"ID":"c95796cc-d004-4e3c-bfcb-38be356dbf92","Type":"ContainerDied","Data":"6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13"} Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.242819 4574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.243608 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 04 05:05:50 crc 
kubenswrapper[4574]: I1004 05:05:50.243647 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.263603 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-766d778598-9bz6b"] Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.294684 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-logs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.294755 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-config-data\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.295004 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-config-data-custom\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.295101 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-combined-ca-bundle\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 
05:05:50.296370 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-public-tls-certs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.296465 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-internal-tls-certs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.296539 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcdx9\" (UniqueName: \"kubernetes.io/projected/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-kube-api-access-dcdx9\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.347348 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-848d9d6b7d-xxqvb" podStartSLOduration=4.347328068 podStartE2EDuration="4.347328068s" podCreationTimestamp="2025-10-04 05:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:05:50.337442356 +0000 UTC m=+1176.191585428" watchObservedRunningTime="2025-10-04 05:05:50.347328068 +0000 UTC m=+1176.201471110" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.398259 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-combined-ca-bundle\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.398606 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-public-tls-certs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.398798 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-internal-tls-certs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.398914 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcdx9\" (UniqueName: \"kubernetes.io/projected/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-kube-api-access-dcdx9\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.399492 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-logs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.399649 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-config-data\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.399954 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-config-data-custom\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.401872 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-logs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.405190 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-public-tls-certs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.407421 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-config-data-custom\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.408697 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-config-data\") pod 
\"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.409739 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-internal-tls-certs\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.416581 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-combined-ca-bundle\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.427962 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcdx9\" (UniqueName: \"kubernetes.io/projected/c224adb6-7a04-4bd4-bc6a-d8c484c8710e-kube-api-access-dcdx9\") pod \"barbican-api-766d778598-9bz6b\" (UID: \"c224adb6-7a04-4bd4-bc6a-d8c484c8710e\") " pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.559259 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.779169 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54746bc5fc-22pbj" Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.874784 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fdd4f7798-vj9tl"] Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.875001 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fdd4f7798-vj9tl" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-api" containerID="cri-o://655dae3bd54eaa26e8f66741694c4f05a6d40744481378a1a8f28f6c7e36ea08" gracePeriod=30 Oct 04 05:05:50 crc kubenswrapper[4574]: I1004 05:05:50.875307 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fdd4f7798-vj9tl" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-httpd" containerID="cri-o://67d119404faa86d73dc38c89ef684141fecea4762f7fa63a1580749cea1c68c2" gracePeriod=30 Oct 04 05:05:52 crc kubenswrapper[4574]: I1004 05:05:52.275572 4574 generic.go:334] "Generic (PLEG): container finished" podID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerID="67d119404faa86d73dc38c89ef684141fecea4762f7fa63a1580749cea1c68c2" exitCode=0 Oct 04 05:05:52 crc kubenswrapper[4574]: I1004 05:05:52.276074 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fdd4f7798-vj9tl" event={"ID":"cec43bff-ec9c-4c1f-975c-85c3292c3458","Type":"ContainerDied","Data":"67d119404faa86d73dc38c89ef684141fecea4762f7fa63a1580749cea1c68c2"} Oct 04 05:05:53 crc kubenswrapper[4574]: I1004 05:05:53.222015 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:05:53 crc kubenswrapper[4574]: I1004 05:05:53.222372 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-central-agent" containerID="cri-o://77a156f6abff746f8f7118780c377a3d23b535913ac83312482804706bc44b8b" gracePeriod=30 Oct 04 05:05:53 crc kubenswrapper[4574]: I1004 05:05:53.222533 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="proxy-httpd" containerID="cri-o://25dee4d6be9b32c6e76e1150a59a6eb4f9717d7c9a3353d402fa996fa6c44139" gracePeriod=30 Oct 04 05:05:53 crc kubenswrapper[4574]: I1004 05:05:53.222600 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="sg-core" containerID="cri-o://3265cf1123ef8ef67b95a71de9ba52b89d7ed450840d39525d8c32c74c8f0da2" gracePeriod=30 Oct 04 05:05:53 crc kubenswrapper[4574]: I1004 05:05:53.222662 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-notification-agent" containerID="cri-o://8070f6da02baa0889f79b66693906c23308b25c0e6038a5b5ed2aeda3dc08d7d" gracePeriod=30 Oct 04 05:05:53 crc kubenswrapper[4574]: I1004 05:05:53.291092 4574 generic.go:334] "Generic (PLEG): container finished" podID="9bd3ebd3-498c-4070-9de7-eab9d2866108" containerID="e00a043640637a84e7524626f1b3dbf01348164f357bc15c5c2ccde54fb3dac2" exitCode=0 Oct 04 05:05:53 crc kubenswrapper[4574]: I1004 05:05:53.291158 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-khrbr" event={"ID":"9bd3ebd3-498c-4070-9de7-eab9d2866108","Type":"ContainerDied","Data":"e00a043640637a84e7524626f1b3dbf01348164f357bc15c5c2ccde54fb3dac2"} Oct 04 05:05:53 crc kubenswrapper[4574]: E1004 05:05:53.471184 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6bc9eae_4c64_4f73_9d33_f0fcd6655845.slice/crio-25dee4d6be9b32c6e76e1150a59a6eb4f9717d7c9a3353d402fa996fa6c44139.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.305887 4574 generic.go:334] "Generic (PLEG): container finished" podID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerID="25dee4d6be9b32c6e76e1150a59a6eb4f9717d7c9a3353d402fa996fa6c44139" exitCode=0 Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.306290 4574 generic.go:334] "Generic (PLEG): container finished" podID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerID="3265cf1123ef8ef67b95a71de9ba52b89d7ed450840d39525d8c32c74c8f0da2" exitCode=2 Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.306329 4574 generic.go:334] "Generic (PLEG): container finished" podID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerID="8070f6da02baa0889f79b66693906c23308b25c0e6038a5b5ed2aeda3dc08d7d" exitCode=0 Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.306339 4574 generic.go:334] "Generic (PLEG): container finished" podID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerID="77a156f6abff746f8f7118780c377a3d23b535913ac83312482804706bc44b8b" exitCode=0 Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.306138 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerDied","Data":"25dee4d6be9b32c6e76e1150a59a6eb4f9717d7c9a3353d402fa996fa6c44139"} Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.306445 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerDied","Data":"3265cf1123ef8ef67b95a71de9ba52b89d7ed450840d39525d8c32c74c8f0da2"} Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.306472 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerDied","Data":"8070f6da02baa0889f79b66693906c23308b25c0e6038a5b5ed2aeda3dc08d7d"} Oct 04 05:05:54 crc kubenswrapper[4574]: I1004 05:05:54.306489 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerDied","Data":"77a156f6abff746f8f7118780c377a3d23b535913ac83312482804706bc44b8b"} Oct 04 05:05:55 crc kubenswrapper[4574]: I1004 05:05:55.099706 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 04 05:05:55 crc kubenswrapper[4574]: I1004 05:05:55.100117 4574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 04 05:05:55 crc kubenswrapper[4574]: I1004 05:05:55.110262 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 04 05:05:55 crc kubenswrapper[4574]: I1004 05:05:55.211008 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:55 crc kubenswrapper[4574]: I1004 05:05:55.211126 4574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 04 05:05:55 crc kubenswrapper[4574]: I1004 05:05:55.216393 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 04 05:05:57 crc kubenswrapper[4574]: I1004 05:05:57.351963 4574 generic.go:334] "Generic (PLEG): container finished" podID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerID="655dae3bd54eaa26e8f66741694c4f05a6d40744481378a1a8f28f6c7e36ea08" exitCode=0 Oct 04 05:05:57 crc kubenswrapper[4574]: I1004 05:05:57.352034 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fdd4f7798-vj9tl" 
event={"ID":"cec43bff-ec9c-4c1f-975c-85c3292c3458","Type":"ContainerDied","Data":"655dae3bd54eaa26e8f66741694c4f05a6d40744481378a1a8f28f6c7e36ea08"} Oct 04 05:05:59 crc kubenswrapper[4574]: I1004 05:05:59.180834 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:05:59 crc kubenswrapper[4574]: I1004 05:05:59.370115 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:06:00 crc kubenswrapper[4574]: I1004 05:06:00.858412 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zrf8p"] Oct 04 05:06:00 crc kubenswrapper[4574]: I1004 05:06:00.860102 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zrf8p" Oct 04 05:06:00 crc kubenswrapper[4574]: I1004 05:06:00.875978 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zrf8p"] Oct 04 05:06:00 crc kubenswrapper[4574]: I1004 05:06:00.949328 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8lmh\" (UniqueName: \"kubernetes.io/projected/20ee193e-b13b-4da7-8f59-a438c1c787c7-kube-api-access-v8lmh\") pod \"nova-api-db-create-zrf8p\" (UID: \"20ee193e-b13b-4da7-8f59-a438c1c787c7\") " pod="openstack/nova-api-db-create-zrf8p" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.045363 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tgzwb"] Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.046549 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tgzwb" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.051509 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8lmh\" (UniqueName: \"kubernetes.io/projected/20ee193e-b13b-4da7-8f59-a438c1c787c7-kube-api-access-v8lmh\") pod \"nova-api-db-create-zrf8p\" (UID: \"20ee193e-b13b-4da7-8f59-a438c1c787c7\") " pod="openstack/nova-api-db-create-zrf8p" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.055284 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgzwb"] Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.074781 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8lmh\" (UniqueName: \"kubernetes.io/projected/20ee193e-b13b-4da7-8f59-a438c1c787c7-kube-api-access-v8lmh\") pod \"nova-api-db-create-zrf8p\" (UID: \"20ee193e-b13b-4da7-8f59-a438c1c787c7\") " pod="openstack/nova-api-db-create-zrf8p" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.156676 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnblx\" (UniqueName: \"kubernetes.io/projected/176077ad-74a8-403d-8917-8288171aa8d4-kube-api-access-vnblx\") pod \"nova-cell0-db-create-tgzwb\" (UID: \"176077ad-74a8-403d-8917-8288171aa8d4\") " pod="openstack/nova-cell0-db-create-tgzwb" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.184954 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zrf8p" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.241554 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tr5bq"] Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.242721 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tr5bq" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.258390 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnblx\" (UniqueName: \"kubernetes.io/projected/176077ad-74a8-403d-8917-8288171aa8d4-kube-api-access-vnblx\") pod \"nova-cell0-db-create-tgzwb\" (UID: \"176077ad-74a8-403d-8917-8288171aa8d4\") " pod="openstack/nova-cell0-db-create-tgzwb" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.267633 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tr5bq"] Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.298211 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnblx\" (UniqueName: \"kubernetes.io/projected/176077ad-74a8-403d-8917-8288171aa8d4-kube-api-access-vnblx\") pod \"nova-cell0-db-create-tgzwb\" (UID: \"176077ad-74a8-403d-8917-8288171aa8d4\") " pod="openstack/nova-cell0-db-create-tgzwb" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.360476 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptq7t\" (UniqueName: \"kubernetes.io/projected/4e5fe7de-58bc-4111-97a0-662294fd048b-kube-api-access-ptq7t\") pod \"nova-cell1-db-create-tr5bq\" (UID: \"4e5fe7de-58bc-4111-97a0-662294fd048b\") " pod="openstack/nova-cell1-db-create-tr5bq" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.364151 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tgzwb" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.462247 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptq7t\" (UniqueName: \"kubernetes.io/projected/4e5fe7de-58bc-4111-97a0-662294fd048b-kube-api-access-ptq7t\") pod \"nova-cell1-db-create-tr5bq\" (UID: \"4e5fe7de-58bc-4111-97a0-662294fd048b\") " pod="openstack/nova-cell1-db-create-tr5bq" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.481207 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptq7t\" (UniqueName: \"kubernetes.io/projected/4e5fe7de-58bc-4111-97a0-662294fd048b-kube-api-access-ptq7t\") pod \"nova-cell1-db-create-tr5bq\" (UID: \"4e5fe7de-58bc-4111-97a0-662294fd048b\") " pod="openstack/nova-cell1-db-create-tr5bq" Oct 04 05:06:01 crc kubenswrapper[4574]: I1004 05:06:01.569248 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tr5bq" Oct 04 05:06:02 crc kubenswrapper[4574]: E1004 05:06:02.431846 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Oct 04 05:06:02 crc kubenswrapper[4574]: E1004 05:06:02.432382 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb4h677h58bh694h558h667h686h65h576hf8h5fdh56chf5hc4h645h68ch59dhb7h687h57hd9h578h5f6h97h5hf8h57h564h6h5c6h67ch546q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rmbs6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(2552db74-0d8b-4ca0-af2e-092c03e097f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:06:02 crc kubenswrapper[4574]: E1004 05:06:02.433941 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="2552db74-0d8b-4ca0-af2e-092c03e097f2" Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.723079 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-khrbr" Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.892585 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-scripts\") pod \"9bd3ebd3-498c-4070-9de7-eab9d2866108\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.892654 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-combined-ca-bundle\") pod \"9bd3ebd3-498c-4070-9de7-eab9d2866108\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.892726 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bd3ebd3-498c-4070-9de7-eab9d2866108-etc-machine-id\") pod \"9bd3ebd3-498c-4070-9de7-eab9d2866108\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.892795 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxlrl\" (UniqueName: \"kubernetes.io/projected/9bd3ebd3-498c-4070-9de7-eab9d2866108-kube-api-access-lxlrl\") pod \"9bd3ebd3-498c-4070-9de7-eab9d2866108\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.893049 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-db-sync-config-data\") pod \"9bd3ebd3-498c-4070-9de7-eab9d2866108\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.893179 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-config-data\") pod \"9bd3ebd3-498c-4070-9de7-eab9d2866108\" (UID: \"9bd3ebd3-498c-4070-9de7-eab9d2866108\") " Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.893914 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bd3ebd3-498c-4070-9de7-eab9d2866108-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9bd3ebd3-498c-4070-9de7-eab9d2866108" (UID: "9bd3ebd3-498c-4070-9de7-eab9d2866108"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.894486 4574 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bd3ebd3-498c-4070-9de7-eab9d2866108-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.901745 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd3ebd3-498c-4070-9de7-eab9d2866108-kube-api-access-lxlrl" (OuterVolumeSpecName: "kube-api-access-lxlrl") pod "9bd3ebd3-498c-4070-9de7-eab9d2866108" (UID: "9bd3ebd3-498c-4070-9de7-eab9d2866108"). InnerVolumeSpecName "kube-api-access-lxlrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.903627 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-scripts" (OuterVolumeSpecName: "scripts") pod "9bd3ebd3-498c-4070-9de7-eab9d2866108" (UID: "9bd3ebd3-498c-4070-9de7-eab9d2866108"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:02 crc kubenswrapper[4574]: I1004 05:06:02.903776 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9bd3ebd3-498c-4070-9de7-eab9d2866108" (UID: "9bd3ebd3-498c-4070-9de7-eab9d2866108"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.008143 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.008533 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxlrl\" (UniqueName: \"kubernetes.io/projected/9bd3ebd3-498c-4070-9de7-eab9d2866108-kube-api-access-lxlrl\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.008549 4574 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.043776 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bd3ebd3-498c-4070-9de7-eab9d2866108" (UID: "9bd3ebd3-498c-4070-9de7-eab9d2866108"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.116832 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.174041 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-config-data" (OuterVolumeSpecName: "config-data") pod "9bd3ebd3-498c-4070-9de7-eab9d2866108" (UID: "9bd3ebd3-498c-4070-9de7-eab9d2866108"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.220006 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd3ebd3-498c-4070-9de7-eab9d2866108-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.345642 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7688fc9d67-qlxww"] Oct 04 05:06:03 crc kubenswrapper[4574]: W1004 05:06:03.379398 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod710de145_ae9a_41bf_9b90_564a1e4acee6.slice/crio-2ddee03f8745ed894546ea9412bd631173f90c252b3717380fe9e40ea1df378c WatchSource:0}: Error finding container 2ddee03f8745ed894546ea9412bd631173f90c252b3717380fe9e40ea1df378c: Status 404 returned error can't find the container with id 2ddee03f8745ed894546ea9412bd631173f90c252b3717380fe9e40ea1df378c Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.393058 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.472194 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-khrbr" event={"ID":"9bd3ebd3-498c-4070-9de7-eab9d2866108","Type":"ContainerDied","Data":"dfea24e32f2d2c9b15bbc9ff2082cc64c771952182e8954a5c18d683e49f62fd"} Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.472244 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfea24e32f2d2c9b15bbc9ff2082cc64c771952182e8954a5c18d683e49f62fd" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.472304 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-khrbr" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.490774 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fdd4f7798-vj9tl" event={"ID":"cec43bff-ec9c-4c1f-975c-85c3292c3458","Type":"ContainerDied","Data":"2779e3e30c1951ca8b280e1f7f039da0646b0d1ee198fa4e77cb8170a577876f"} Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.490820 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2779e3e30c1951ca8b280e1f7f039da0646b0d1ee198fa4e77cb8170a577876f" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.493108 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7688fc9d67-qlxww" event={"ID":"710de145-ae9a-41bf-9b90-564a1e4acee6","Type":"ContainerStarted","Data":"2ddee03f8745ed894546ea9412bd631173f90c252b3717380fe9e40ea1df378c"} Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.507675 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6bc9eae-4c64-4f73-9d33-f0fcd6655845","Type":"ContainerDied","Data":"ac9df1336282620c4b5d4b21ec53093ac2f06507b975a0ba0a59843eeb4a2ba2"} Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.507752 4574 scope.go:117] "RemoveContainer" 
containerID="25dee4d6be9b32c6e76e1150a59a6eb4f9717d7c9a3353d402fa996fa6c44139" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.507705 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.519065 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" event={"ID":"c95796cc-d004-4e3c-bfcb-38be356dbf92","Type":"ContainerStarted","Data":"74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c"} Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.519110 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:06:03 crc kubenswrapper[4574]: E1004 05:06:03.520973 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="2552db74-0d8b-4ca0-af2e-092c03e097f2" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.526207 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-config-data\") pod \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.526281 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-sg-core-conf-yaml\") pod \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.526348 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-log-httpd\") pod \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.526383 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-scripts\") pod \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.526745 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpdfc\" (UniqueName: \"kubernetes.io/projected/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-kube-api-access-zpdfc\") pod \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.526817 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-combined-ca-bundle\") pod \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.526836 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-run-httpd\") pod \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\" (UID: \"d6bc9eae-4c64-4f73-9d33-f0fcd6655845\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.527786 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d6bc9eae-4c64-4f73-9d33-f0fcd6655845" (UID: "d6bc9eae-4c64-4f73-9d33-f0fcd6655845"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.534484 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d6bc9eae-4c64-4f73-9d33-f0fcd6655845" (UID: "d6bc9eae-4c64-4f73-9d33-f0fcd6655845"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.558814 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-scripts" (OuterVolumeSpecName: "scripts") pod "d6bc9eae-4c64-4f73-9d33-f0fcd6655845" (UID: "d6bc9eae-4c64-4f73-9d33-f0fcd6655845"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.558950 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-kube-api-access-zpdfc" (OuterVolumeSpecName: "kube-api-access-zpdfc") pod "d6bc9eae-4c64-4f73-9d33-f0fcd6655845" (UID: "d6bc9eae-4c64-4f73-9d33-f0fcd6655845"). InnerVolumeSpecName "kube-api-access-zpdfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.592373 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-766d778598-9bz6b"] Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.593474 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d6bc9eae-4c64-4f73-9d33-f0fcd6655845" (UID: "d6bc9eae-4c64-4f73-9d33-f0fcd6655845"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.600988 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" podStartSLOduration=17.600965762 podStartE2EDuration="17.600965762s" podCreationTimestamp="2025-10-04 05:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:06:03.581464146 +0000 UTC m=+1189.435607188" watchObservedRunningTime="2025-10-04 05:06:03.600965762 +0000 UTC m=+1189.455108794" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.629578 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.629612 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.629626 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.629639 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.629651 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpdfc\" (UniqueName: \"kubernetes.io/projected/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-kube-api-access-zpdfc\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: W1004 
05:06:03.638164 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc224adb6_7a04_4bd4_bc6a_d8c484c8710e.slice/crio-999141e2ecc30a6e056bf68a10a48adb1bfabc72531d37c8a0319ae3b3a6c760 WatchSource:0}: Error finding container 999141e2ecc30a6e056bf68a10a48adb1bfabc72531d37c8a0319ae3b3a6c760: Status 404 returned error can't find the container with id 999141e2ecc30a6e056bf68a10a48adb1bfabc72531d37c8a0319ae3b3a6c760 Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.711455 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-config-data" (OuterVolumeSpecName: "config-data") pod "d6bc9eae-4c64-4f73-9d33-f0fcd6655845" (UID: "d6bc9eae-4c64-4f73-9d33-f0fcd6655845"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.733331 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.809509 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6bc9eae-4c64-4f73-9d33-f0fcd6655845" (UID: "d6bc9eae-4c64-4f73-9d33-f0fcd6655845"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.820319 4574 scope.go:117] "RemoveContainer" containerID="3265cf1123ef8ef67b95a71de9ba52b89d7ed450840d39525d8c32c74c8f0da2" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.835319 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bc9eae-4c64-4f73-9d33-f0fcd6655845-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.857825 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.901908 4574 scope.go:117] "RemoveContainer" containerID="8070f6da02baa0889f79b66693906c23308b25c0e6038a5b5ed2aeda3dc08d7d" Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.937029 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-ovndb-tls-certs\") pod \"cec43bff-ec9c-4c1f-975c-85c3292c3458\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.937194 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxp9q\" (UniqueName: \"kubernetes.io/projected/cec43bff-ec9c-4c1f-975c-85c3292c3458-kube-api-access-zxp9q\") pod \"cec43bff-ec9c-4c1f-975c-85c3292c3458\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.937339 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-config\") pod \"cec43bff-ec9c-4c1f-975c-85c3292c3458\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.937508 4574 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-combined-ca-bundle\") pod \"cec43bff-ec9c-4c1f-975c-85c3292c3458\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.937544 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-httpd-config\") pod \"cec43bff-ec9c-4c1f-975c-85c3292c3458\" (UID: \"cec43bff-ec9c-4c1f-975c-85c3292c3458\") " Oct 04 05:06:03 crc kubenswrapper[4574]: I1004 05:06:03.961478 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:03.998733 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.033528 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec43bff-ec9c-4c1f-975c-85c3292c3458-kube-api-access-zxp9q" (OuterVolumeSpecName: "kube-api-access-zxp9q") pod "cec43bff-ec9c-4c1f-975c-85c3292c3458" (UID: "cec43bff-ec9c-4c1f-975c-85c3292c3458"). InnerVolumeSpecName "kube-api-access-zxp9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.041715 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxp9q\" (UniqueName: \"kubernetes.io/projected/cec43bff-ec9c-4c1f-975c-85c3292c3458-kube-api-access-zxp9q\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.046655 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cec43bff-ec9c-4c1f-975c-85c3292c3458" (UID: "cec43bff-ec9c-4c1f-975c-85c3292c3458"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.100685 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zrf8p"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.116190 4574 scope.go:117] "RemoveContainer" containerID="77a156f6abff746f8f7118780c377a3d23b535913ac83312482804706bc44b8b" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.184757 4574 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.224830 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 05:06:04.243847 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-notification-agent" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.243887 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-notification-agent" Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 
05:06:04.243909 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="sg-core" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.243917 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="sg-core" Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 05:06:04.243955 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd3ebd3-498c-4070-9de7-eab9d2866108" containerName="cinder-db-sync" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.243962 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd3ebd3-498c-4070-9de7-eab9d2866108" containerName="cinder-db-sync" Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 05:06:04.243981 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-httpd" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.243987 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-httpd" Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 05:06:04.244016 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="proxy-httpd" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244043 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="proxy-httpd" Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 05:06:04.244052 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-central-agent" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244059 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-central-agent" Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 05:06:04.244089 4574 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-api" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244097 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-api" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244627 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-central-agent" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244650 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="proxy-httpd" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244663 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-api" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244681 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="ceilometer-notification-agent" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244692 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" containerName="neutron-httpd" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244709 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" containerName="sg-core" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.244716 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd3ebd3-498c-4070-9de7-eab9d2866108" containerName="cinder-db-sync" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.269719 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.312002 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.312266 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.330426 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgzwb"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.370990 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.371153 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.371752 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-config-data\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.371827 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wqz\" (UniqueName: \"kubernetes.io/projected/8cb7137b-948e-40ee-9424-267fdeb9e1c0-kube-api-access-74wqz\") pod \"ceilometer-0\" (UID: 
\"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.371900 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-scripts\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.371943 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.372044 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.415096 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:04 crc kubenswrapper[4574]: E1004 05:06:04.449826 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bd3ebd3_498c_4070_9de7_eab9d2866108.slice/crio-dfea24e32f2d2c9b15bbc9ff2082cc64c771952182e8954a5c18d683e49f62fd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6bc9eae_4c64_4f73_9d33_f0fcd6655845.slice/crio-ac9df1336282620c4b5d4b21ec53093ac2f06507b975a0ba0a59843eeb4a2ba2\": RecentStats: unable to find data in memory cache]" Oct 04 05:06:04 crc 
kubenswrapper[4574]: I1004 05:06:04.473414 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.474530 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.474724 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-config-data\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.474866 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wqz\" (UniqueName: \"kubernetes.io/projected/8cb7137b-948e-40ee-9424-267fdeb9e1c0-kube-api-access-74wqz\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.474987 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-scripts\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.475080 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-run-httpd\") pod 
\"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.475198 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.473621 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tr5bq"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.481960 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-run-httpd\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.482357 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-log-httpd\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.534593 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.540367 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wqz\" (UniqueName: \"kubernetes.io/projected/8cb7137b-948e-40ee-9424-267fdeb9e1c0-kube-api-access-74wqz\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " 
pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.574442 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86595bb85-v84cq" event={"ID":"857fe45e-27ff-44ef-b58c-9e1278946927","Type":"ContainerStarted","Data":"6046755bd3f7290dea67c7b675a702d1cb88d1b555b369e7473f1b9bc4f20fa7"} Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.587444 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.589454 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.593834 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zbcn2" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.594018 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.594369 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.600190 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.620313 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.628094 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerStarted","Data":"6d70675baea48ecf302727fcdbee1acb4f0596c28e0e43a0d20ca60fc171a6ba"} Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.628903 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.630001 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-config-data\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.631560 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-scripts\") pod \"ceilometer-0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.646711 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.650006 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bfb4d496-nv6hv" event={"ID":"85281a42-f9ab-4302-9fe9-4e742075530f","Type":"ContainerStarted","Data":"7e11226527b91eec1e02086808d22606900e396ca4000781b4ed8449905f45e8"} Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.651672 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-fgd5b"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.670475 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bcjvm"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.672447 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.684329 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766d778598-9bz6b" event={"ID":"c224adb6-7a04-4bd4-bc6a-d8c484c8710e","Type":"ContainerStarted","Data":"999141e2ecc30a6e056bf68a10a48adb1bfabc72531d37c8a0319ae3b3a6c760"} Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.699797 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bcjvm"] Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.700877 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zrf8p" event={"ID":"20ee193e-b13b-4da7-8f59-a438c1c787c7","Type":"ContainerStarted","Data":"57f758ae17c3ca213b3717aa90cec1f4068d8aaf3633774198c5f3e8c69e4d3f"} Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.750471 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fdd4f7798-vj9tl" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.764977 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6bc9eae-4c64-4f73-9d33-f0fcd6655845" path="/var/lib/kubelet/pods/d6bc9eae-4c64-4f73-9d33-f0fcd6655845/volumes" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782634 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782692 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782720 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782765 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-svc\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782790 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/d552b9ab-5c77-4524-b268-ea27f9a661c4-kube-api-access-5j7f6\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782806 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-config\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782828 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782895 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-scripts\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782918 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.782939 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.783088 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8mv\" (UniqueName: \"kubernetes.io/projected/b6ee152d-8343-47d6-8a16-cfed435bee04-kube-api-access-rx8mv\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.783106 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d552b9ab-5c77-4524-b268-ea27f9a661c4-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885470 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8mv\" (UniqueName: \"kubernetes.io/projected/b6ee152d-8343-47d6-8a16-cfed435bee04-kube-api-access-rx8mv\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885776 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d552b9ab-5c77-4524-b268-ea27f9a661c4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885802 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885821 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885845 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " 
pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885887 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-svc\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885913 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/d552b9ab-5c77-4524-b268-ea27f9a661c4-kube-api-access-5j7f6\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885931 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-config\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885950 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.885978 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-scripts\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.886002 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.886024 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.887112 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.887408 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d552b9ab-5c77-4524-b268-ea27f9a661c4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.887964 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.889354 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-config\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.897216 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-svc\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.899791 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.909495 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.911917 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8mv\" (UniqueName: \"kubernetes.io/projected/b6ee152d-8343-47d6-8a16-cfed435bee04-kube-api-access-rx8mv\") pod \"dnsmasq-dns-5784cf869f-bcjvm\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.917588 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.925195 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/d552b9ab-5c77-4524-b268-ea27f9a661c4-kube-api-access-5j7f6\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.926105 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:04 crc kubenswrapper[4574]: I1004 05:06:04.932868 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.479646 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-config" (OuterVolumeSpecName: "config") pod "cec43bff-ec9c-4c1f-975c-85c3292c3458" (UID: "cec43bff-ec9c-4c1f-975c-85c3292c3458"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:05 crc kubenswrapper[4574]: W1004 05:06:05.497971 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cb7137b_948e_40ee_9424_267fdeb9e1c0.slice/crio-b15ae781cf81c396c20707d63ed388bb2682c5437fba92e132db24511c2e539d WatchSource:0}: Error finding container b15ae781cf81c396c20707d63ed388bb2682c5437fba92e132db24511c2e539d: Status 404 returned error can't find the container with id b15ae781cf81c396c20707d63ed388bb2682c5437fba92e132db24511c2e539d Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.531971 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cec43bff-ec9c-4c1f-975c-85c3292c3458" (UID: "cec43bff-ec9c-4c1f-975c-85c3292c3458"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.560344 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.569218 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.605808 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cec43bff-ec9c-4c1f-975c-85c3292c3458" (UID: "cec43bff-ec9c-4c1f-975c-85c3292c3458"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.670736 4574 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec43bff-ec9c-4c1f-975c-85c3292c3458-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.717803 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.719906 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.719972 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgzwb" event={"ID":"176077ad-74a8-403d-8917-8288171aa8d4","Type":"ContainerStarted","Data":"b5bc0816bda01a11779bae4bfcb630b4d8575839990112001708fd88fccc2930"} Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.720184 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.720585 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tr5bq" event={"ID":"4e5fe7de-58bc-4111-97a0-662294fd048b","Type":"ContainerStarted","Data":"2fe134209db305cf7038719f746d27195495f323537eb835f7f6be0597a701d9"} Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.720646 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.726850 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.771843 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766d778598-9bz6b" event={"ID":"c224adb6-7a04-4bd4-bc6a-d8c484c8710e","Type":"ContainerStarted","Data":"3acee4e05898be2163538592841397ff8c8294b9d059208a8b3e63eade9e6f4a"} Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.773636 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532cf81d-d285-432f-a77e-4b1b04f4388c-logs\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.773689 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.773720 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.773785 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-scripts\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.773808 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrj4k\" (UniqueName: \"kubernetes.io/projected/532cf81d-d285-432f-a77e-4b1b04f4388c-kube-api-access-lrj4k\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.773861 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data-custom\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.773898 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/532cf81d-d285-432f-a77e-4b1b04f4388c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.781140 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" 
event={"ID":"5438cd90-23bc-4da2-8856-519b7656f8ff","Type":"ContainerStarted","Data":"090f96c37b28efa226266c8958a746e6b320dbfb23362e21e2d52c4fcf7b0caa"} Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.799710 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerStarted","Data":"b15ae781cf81c396c20707d63ed388bb2682c5437fba92e132db24511c2e539d"} Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.815505 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86595bb85-v84cq" event={"ID":"857fe45e-27ff-44ef-b58c-9e1278946927","Type":"ContainerStarted","Data":"b3b7572910d77f8bfb9a78e91eb1a13c4422e56823a27230513a191a2fca7840"} Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.832346 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7688fc9d67-qlxww" event={"ID":"710de145-ae9a-41bf-9b90-564a1e4acee6","Type":"ContainerStarted","Data":"aa972331ccb05c02baf597e4a49c7170a2b6c28ea4cd01a504d24dba605baafa"} Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.832425 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fdd4f7798-vj9tl"] Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.832716 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" podUID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerName="dnsmasq-dns" containerID="cri-o://74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c" gracePeriod=10 Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.838756 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6fdd4f7798-vj9tl"] Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.854490 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-86595bb85-v84cq" podStartSLOduration=4.801201709 
podStartE2EDuration="19.854471617s" podCreationTimestamp="2025-10-04 05:05:46 +0000 UTC" firstStartedPulling="2025-10-04 05:05:47.500460681 +0000 UTC m=+1173.354603723" lastFinishedPulling="2025-10-04 05:06:02.553730589 +0000 UTC m=+1188.407873631" observedRunningTime="2025-10-04 05:06:05.851650596 +0000 UTC m=+1191.705793638" watchObservedRunningTime="2025-10-04 05:06:05.854471617 +0000 UTC m=+1191.708614659" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.857483 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.874831 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.877740 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532cf81d-d285-432f-a77e-4b1b04f4388c-logs\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.877816 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.877851 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.877907 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-scripts\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.877928 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrj4k\" (UniqueName: \"kubernetes.io/projected/532cf81d-d285-432f-a77e-4b1b04f4388c-kube-api-access-lrj4k\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.878000 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data-custom\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.878041 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/532cf81d-d285-432f-a77e-4b1b04f4388c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.878925 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/532cf81d-d285-432f-a77e-4b1b04f4388c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.879560 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532cf81d-d285-432f-a77e-4b1b04f4388c-logs\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 
05:06:05.887217 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-scripts\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.961125 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data-custom\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.963364 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.966624 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:05 crc kubenswrapper[4574]: I1004 05:06:05.973503 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrj4k\" (UniqueName: \"kubernetes.io/projected/532cf81d-d285-432f-a77e-4b1b04f4388c-kube-api-access-lrj4k\") pod \"cinder-api-0\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " pod="openstack/cinder-api-0" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.189075 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.734651 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.770849 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec43bff-ec9c-4c1f-975c-85c3292c3458" path="/var/lib/kubelet/pods/cec43bff-ec9c-4c1f-975c-85c3292c3458/volumes" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.810890 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-swift-storage-0\") pod \"c95796cc-d004-4e3c-bfcb-38be356dbf92\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.811338 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-sb\") pod \"c95796cc-d004-4e3c-bfcb-38be356dbf92\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.811443 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l569k\" (UniqueName: \"kubernetes.io/projected/c95796cc-d004-4e3c-bfcb-38be356dbf92-kube-api-access-l569k\") pod \"c95796cc-d004-4e3c-bfcb-38be356dbf92\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.811568 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-config\") pod \"c95796cc-d004-4e3c-bfcb-38be356dbf92\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.811679 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-nb\") pod \"c95796cc-d004-4e3c-bfcb-38be356dbf92\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.811779 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-svc\") pod \"c95796cc-d004-4e3c-bfcb-38be356dbf92\" (UID: \"c95796cc-d004-4e3c-bfcb-38be356dbf92\") " Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.856497 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c95796cc-d004-4e3c-bfcb-38be356dbf92-kube-api-access-l569k" (OuterVolumeSpecName: "kube-api-access-l569k") pod "c95796cc-d004-4e3c-bfcb-38be356dbf92" (UID: "c95796cc-d004-4e3c-bfcb-38be356dbf92"). InnerVolumeSpecName "kube-api-access-l569k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.923754 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l569k\" (UniqueName: \"kubernetes.io/projected/c95796cc-d004-4e3c-bfcb-38be356dbf92-kube-api-access-l569k\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.927077 4574 generic.go:334] "Generic (PLEG): container finished" podID="20ee193e-b13b-4da7-8f59-a438c1c787c7" containerID="070d29639d18fbc298dbaa50fd29971e1b6ee2075d540cc014e5f3e1ea5909d0" exitCode=0 Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.927516 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zrf8p" event={"ID":"20ee193e-b13b-4da7-8f59-a438c1c787c7","Type":"ContainerDied","Data":"070d29639d18fbc298dbaa50fd29971e1b6ee2075d540cc014e5f3e1ea5909d0"} Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.927552 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:06 crc 
kubenswrapper[4574]: I1004 05:06:06.934005 4574 generic.go:334] "Generic (PLEG): container finished" podID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerID="74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c" exitCode=0 Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.934114 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" event={"ID":"c95796cc-d004-4e3c-bfcb-38be356dbf92","Type":"ContainerDied","Data":"74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c"} Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.934143 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" event={"ID":"c95796cc-d004-4e3c-bfcb-38be356dbf92","Type":"ContainerDied","Data":"540dc6b4978d474519d8a9b9c9fd64dab17c8a1074462c86955ad2ec756a4a15"} Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.934160 4574 scope.go:117] "RemoveContainer" containerID="74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.934428 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-fgd5b" Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.956779 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d552b9ab-5c77-4524-b268-ea27f9a661c4","Type":"ContainerStarted","Data":"cc9d69b8cc202a502eaf0a4602926de8f6a412f4ad97a4c61b47992de5c97088"} Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.974704 4574 generic.go:334] "Generic (PLEG): container finished" podID="176077ad-74a8-403d-8917-8288171aa8d4" containerID="b672895c479b816bc6bb6c0e871a1c341c7aad69f9ed2cb29f9501975c4cb909" exitCode=0 Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.974775 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgzwb" event={"ID":"176077ad-74a8-403d-8917-8288171aa8d4","Type":"ContainerDied","Data":"b672895c479b816bc6bb6c0e871a1c341c7aad69f9ed2cb29f9501975c4cb909"} Oct 04 05:06:06 crc kubenswrapper[4574]: I1004 05:06:06.979210 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c95796cc-d004-4e3c-bfcb-38be356dbf92" (UID: "c95796cc-d004-4e3c-bfcb-38be356dbf92"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.009924 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" event={"ID":"5438cd90-23bc-4da2-8856-519b7656f8ff","Type":"ContainerStarted","Data":"2dd2818fdc4a08fc9425e5cb0a4bebad33d0cd6ca1a89865a39966f4f6260813"} Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.035023 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.035460 4574 scope.go:117] "RemoveContainer" containerID="6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.035752 4574 generic.go:334] "Generic (PLEG): container finished" podID="4e5fe7de-58bc-4111-97a0-662294fd048b" containerID="890e49b885581aef88f08419103b98fbc7609d4becc49ebf9141c939ed7dc9c8" exitCode=0 Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.035818 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tr5bq" event={"ID":"4e5fe7de-58bc-4111-97a0-662294fd048b","Type":"ContainerDied","Data":"890e49b885581aef88f08419103b98fbc7609d4becc49ebf9141c939ed7dc9c8"} Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.073819 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7688fc9d67-qlxww" event={"ID":"710de145-ae9a-41bf-9b90-564a1e4acee6","Type":"ContainerStarted","Data":"a64aa41d4dd8dcf75b51f4c0cceb5c9ab9829b90c0a311711ede6de5361eb032"} Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.074857 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.074888 4574 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.081483 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bcjvm"] Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.093082 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766d778598-9bz6b" event={"ID":"c224adb6-7a04-4bd4-bc6a-d8c484c8710e","Type":"ContainerStarted","Data":"235828b411adf6a6564addf769a055fb9fb25a5e71d0755f1da9ef3f2ee415d9"} Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.093697 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.093756 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.095645 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5475848bb4-qk59c" podStartSLOduration=6.037139654 podStartE2EDuration="21.09561983s" podCreationTimestamp="2025-10-04 05:05:46 +0000 UTC" firstStartedPulling="2025-10-04 05:05:47.496087797 +0000 UTC m=+1173.350230839" lastFinishedPulling="2025-10-04 05:06:02.554567973 +0000 UTC m=+1188.408711015" observedRunningTime="2025-10-04 05:06:07.081981542 +0000 UTC m=+1192.936124584" watchObservedRunningTime="2025-10-04 05:06:07.09561983 +0000 UTC m=+1192.949762862" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.165386 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7688fc9d67-qlxww" podStartSLOduration=18.165361345 podStartE2EDuration="18.165361345s" podCreationTimestamp="2025-10-04 05:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 
05:06:07.115003762 +0000 UTC m=+1192.969146804" watchObservedRunningTime="2025-10-04 05:06:07.165361345 +0000 UTC m=+1193.019504387" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.187841 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c95796cc-d004-4e3c-bfcb-38be356dbf92" (UID: "c95796cc-d004-4e3c-bfcb-38be356dbf92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.230007 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-766d778598-9bz6b" podStartSLOduration=17.229986095 podStartE2EDuration="17.229986095s" podCreationTimestamp="2025-10-04 05:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:06:07.191941452 +0000 UTC m=+1193.046084494" watchObservedRunningTime="2025-10-04 05:06:07.229986095 +0000 UTC m=+1193.084129147" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.237073 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.240431 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.246476 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c95796cc-d004-4e3c-bfcb-38be356dbf92" (UID: "c95796cc-d004-4e3c-bfcb-38be356dbf92"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.249925 4574 scope.go:117] "RemoveContainer" containerID="74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c" Oct 04 05:06:07 crc kubenswrapper[4574]: E1004 05:06:07.250992 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c\": container with ID starting with 74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c not found: ID does not exist" containerID="74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.251034 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c"} err="failed to get container status \"74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c\": rpc error: code = NotFound desc = could not find container \"74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c\": container with ID starting with 74967396c58761401cd230d83b56d936bb47bae7623e9b2aa070d9ecfd90053c not found: ID does not exist" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.251064 4574 scope.go:117] "RemoveContainer" containerID="6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13" Oct 04 05:06:07 crc kubenswrapper[4574]: E1004 05:06:07.252473 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13\": container with ID starting with 6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13 not found: ID does not exist" containerID="6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.252505 
4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13"} err="failed to get container status \"6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13\": rpc error: code = NotFound desc = could not find container \"6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13\": container with ID starting with 6ea50d0548a6112a0781d86161b51d438130305dcce349453a455c1f53c5dd13 not found: ID does not exist" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.298255 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.332792 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c95796cc-d004-4e3c-bfcb-38be356dbf92" (UID: "c95796cc-d004-4e3c-bfcb-38be356dbf92"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.373039 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.373406 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.391381 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-config" (OuterVolumeSpecName: "config") pod "c95796cc-d004-4e3c-bfcb-38be356dbf92" (UID: "c95796cc-d004-4e3c-bfcb-38be356dbf92"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.477115 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c95796cc-d004-4e3c-bfcb-38be356dbf92-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.643363 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-fgd5b"] Oct 04 05:06:07 crc kubenswrapper[4574]: I1004 05:06:07.652099 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-fgd5b"] Oct 04 05:06:08 crc kubenswrapper[4574]: I1004 05:06:08.122980 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerStarted","Data":"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384"} Oct 04 05:06:08 crc kubenswrapper[4574]: I1004 05:06:08.130119 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" event={"ID":"b6ee152d-8343-47d6-8a16-cfed435bee04","Type":"ContainerStarted","Data":"076c21926a5e0e9e76fd035eccb641531423e14ef4b74cdcb8e0663099888c15"} Oct 04 05:06:08 crc kubenswrapper[4574]: I1004 05:06:08.150871 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"532cf81d-d285-432f-a77e-4b1b04f4388c","Type":"ContainerStarted","Data":"85d04f5fb8160911dfd6995bcdd7557645a153709c32c83e6994499fdada5b93"} Oct 04 05:06:08 crc kubenswrapper[4574]: I1004 05:06:08.793091 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c95796cc-d004-4e3c-bfcb-38be356dbf92" path="/var/lib/kubelet/pods/c95796cc-d004-4e3c-bfcb-38be356dbf92/volumes" Oct 04 05:06:08 crc kubenswrapper[4574]: I1004 05:06:08.915302 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tgzwb" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.023519 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnblx\" (UniqueName: \"kubernetes.io/projected/176077ad-74a8-403d-8917-8288171aa8d4-kube-api-access-vnblx\") pod \"176077ad-74a8-403d-8917-8288171aa8d4\" (UID: \"176077ad-74a8-403d-8917-8288171aa8d4\") " Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.049461 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/176077ad-74a8-403d-8917-8288171aa8d4-kube-api-access-vnblx" (OuterVolumeSpecName: "kube-api-access-vnblx") pod "176077ad-74a8-403d-8917-8288171aa8d4" (UID: "176077ad-74a8-403d-8917-8288171aa8d4"). InnerVolumeSpecName "kube-api-access-vnblx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.142092 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnblx\" (UniqueName: \"kubernetes.io/projected/176077ad-74a8-403d-8917-8288171aa8d4-kube-api-access-vnblx\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.158615 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zrf8p" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.214727 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tr5bq" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.273987 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerStarted","Data":"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79"} Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.298952 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tr5bq" event={"ID":"4e5fe7de-58bc-4111-97a0-662294fd048b","Type":"ContainerDied","Data":"2fe134209db305cf7038719f746d27195495f323537eb835f7f6be0597a701d9"} Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.298996 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fe134209db305cf7038719f746d27195495f323537eb835f7f6be0597a701d9" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.299054 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tr5bq" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.337622 4574 generic.go:334] "Generic (PLEG): container finished" podID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerID="c6e7364d21fdb211214c80ddd787a14c2c37d0919f4cd13c3a74b29feb66d9b4" exitCode=0 Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.338804 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" event={"ID":"b6ee152d-8343-47d6-8a16-cfed435bee04","Type":"ContainerDied","Data":"c6e7364d21fdb211214c80ddd787a14c2c37d0919f4cd13c3a74b29feb66d9b4"} Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.348861 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptq7t\" (UniqueName: \"kubernetes.io/projected/4e5fe7de-58bc-4111-97a0-662294fd048b-kube-api-access-ptq7t\") pod \"4e5fe7de-58bc-4111-97a0-662294fd048b\" (UID: \"4e5fe7de-58bc-4111-97a0-662294fd048b\") " Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.348972 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8lmh\" (UniqueName: \"kubernetes.io/projected/20ee193e-b13b-4da7-8f59-a438c1c787c7-kube-api-access-v8lmh\") pod \"20ee193e-b13b-4da7-8f59-a438c1c787c7\" (UID: \"20ee193e-b13b-4da7-8f59-a438c1c787c7\") " Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.364467 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5fe7de-58bc-4111-97a0-662294fd048b-kube-api-access-ptq7t" (OuterVolumeSpecName: "kube-api-access-ptq7t") pod "4e5fe7de-58bc-4111-97a0-662294fd048b" (UID: "4e5fe7de-58bc-4111-97a0-662294fd048b"). InnerVolumeSpecName "kube-api-access-ptq7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.365503 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ee193e-b13b-4da7-8f59-a438c1c787c7-kube-api-access-v8lmh" (OuterVolumeSpecName: "kube-api-access-v8lmh") pod "20ee193e-b13b-4da7-8f59-a438c1c787c7" (UID: "20ee193e-b13b-4da7-8f59-a438c1c787c7"). InnerVolumeSpecName "kube-api-access-v8lmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.397660 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zrf8p" event={"ID":"20ee193e-b13b-4da7-8f59-a438c1c787c7","Type":"ContainerDied","Data":"57f758ae17c3ca213b3717aa90cec1f4068d8aaf3633774198c5f3e8c69e4d3f"} Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.397972 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f758ae17c3ca213b3717aa90cec1f4068d8aaf3633774198c5f3e8c69e4d3f" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.399523 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zrf8p" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.439510 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"532cf81d-d285-432f-a77e-4b1b04f4388c","Type":"ContainerStarted","Data":"de48f94498070689a4fb364de069bc21c63e13851065e9aafa43e5ffbfe49078"} Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.445599 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgzwb" event={"ID":"176077ad-74a8-403d-8917-8288171aa8d4","Type":"ContainerDied","Data":"b5bc0816bda01a11779bae4bfcb630b4d8575839990112001708fd88fccc2930"} Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.445684 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5bc0816bda01a11779bae4bfcb630b4d8575839990112001708fd88fccc2930" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.448805 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tgzwb" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.453664 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptq7t\" (UniqueName: \"kubernetes.io/projected/4e5fe7de-58bc-4111-97a0-662294fd048b-kube-api-access-ptq7t\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.453685 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8lmh\" (UniqueName: \"kubernetes.io/projected/20ee193e-b13b-4da7-8f59-a438c1c787c7-kube-api-access-v8lmh\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.711446 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.712559 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.840463 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:06:09 crc kubenswrapper[4574]: I1004 05:06:09.843572 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:06:10 crc kubenswrapper[4574]: I1004 05:06:10.073262 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7688fc9d67-qlxww" podUID="710de145-ae9a-41bf-9b90-564a1e4acee6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 04 05:06:10 crc kubenswrapper[4574]: I1004 05:06:10.506400 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" event={"ID":"b6ee152d-8343-47d6-8a16-cfed435bee04","Type":"ContainerStarted","Data":"f1e5f87f4d50caabc590bc71bccd0a5aef1a0c7a023ce2635e6eedb76b2b112a"} Oct 04 05:06:10 crc 
kubenswrapper[4574]: I1004 05:06:10.507985 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:10 crc kubenswrapper[4574]: I1004 05:06:10.528680 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d552b9ab-5c77-4524-b268-ea27f9a661c4","Type":"ContainerStarted","Data":"d0e7b99d7a00461a72b61d28a9ac2d9573a5ae27a9cd9abd5ab9c3411bcc8126"} Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.054287 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" podStartSLOduration=7.054272268 podStartE2EDuration="7.054272268s" podCreationTimestamp="2025-10-04 05:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:06:10.570000921 +0000 UTC m=+1196.424143963" watchObservedRunningTime="2025-10-04 05:06:11.054272268 +0000 UTC m=+1196.908415310" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061222 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-86ce-account-create-s4xn9"] Oct 04 05:06:11 crc kubenswrapper[4574]: E1004 05:06:11.061581 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ee193e-b13b-4da7-8f59-a438c1c787c7" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061596 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ee193e-b13b-4da7-8f59-a438c1c787c7" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: E1004 05:06:11.061606 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="176077ad-74a8-403d-8917-8288171aa8d4" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061613 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="176077ad-74a8-403d-8917-8288171aa8d4" 
containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: E1004 05:06:11.061636 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerName="init" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061643 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerName="init" Oct 04 05:06:11 crc kubenswrapper[4574]: E1004 05:06:11.061666 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerName="dnsmasq-dns" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061671 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerName="dnsmasq-dns" Oct 04 05:06:11 crc kubenswrapper[4574]: E1004 05:06:11.061682 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5fe7de-58bc-4111-97a0-662294fd048b" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061688 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5fe7de-58bc-4111-97a0-662294fd048b" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061841 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="176077ad-74a8-403d-8917-8288171aa8d4" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061861 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="c95796cc-d004-4e3c-bfcb-38be356dbf92" containerName="dnsmasq-dns" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061874 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5fe7de-58bc-4111-97a0-662294fd048b" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.061888 4574 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="20ee193e-b13b-4da7-8f59-a438c1c787c7" containerName="mariadb-database-create" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.062485 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86ce-account-create-s4xn9" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.065574 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.078986 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-86ce-account-create-s4xn9"] Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.212841 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdqc7\" (UniqueName: \"kubernetes.io/projected/a0388204-4e2e-4a49-b47a-ef648fba57e8-kube-api-access-kdqc7\") pod \"nova-api-86ce-account-create-s4xn9\" (UID: \"a0388204-4e2e-4a49-b47a-ef648fba57e8\") " pod="openstack/nova-api-86ce-account-create-s4xn9" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.315747 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdqc7\" (UniqueName: \"kubernetes.io/projected/a0388204-4e2e-4a49-b47a-ef648fba57e8-kube-api-access-kdqc7\") pod \"nova-api-86ce-account-create-s4xn9\" (UID: \"a0388204-4e2e-4a49-b47a-ef648fba57e8\") " pod="openstack/nova-api-86ce-account-create-s4xn9" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.335407 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdqc7\" (UniqueName: \"kubernetes.io/projected/a0388204-4e2e-4a49-b47a-ef648fba57e8-kube-api-access-kdqc7\") pod \"nova-api-86ce-account-create-s4xn9\" (UID: \"a0388204-4e2e-4a49-b47a-ef648fba57e8\") " pod="openstack/nova-api-86ce-account-create-s4xn9" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.384648 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-86ce-account-create-s4xn9" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.557770 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"532cf81d-d285-432f-a77e-4b1b04f4388c","Type":"ContainerStarted","Data":"5822fae071cbbe935e582094b21ed25c939bd3f5fa1945a0c757be037c4c79a5"} Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.557970 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api-log" containerID="cri-o://de48f94498070689a4fb364de069bc21c63e13851065e9aafa43e5ffbfe49078" gracePeriod=30 Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.558319 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.558667 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api" containerID="cri-o://5822fae071cbbe935e582094b21ed25c939bd3f5fa1945a0c757be037c4c79a5" gracePeriod=30 Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.592268 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d552b9ab-5c77-4524-b268-ea27f9a661c4","Type":"ContainerStarted","Data":"14e3cf87af38427fbca34d9160e948e91e0ea7040d8e15302a99168afb46a53c"} Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.598932 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerStarted","Data":"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b"} Oct 04 05:06:11 crc kubenswrapper[4574]: I1004 05:06:11.653474 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=7.653450616 podStartE2EDuration="7.653450616s" podCreationTimestamp="2025-10-04 05:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:06:11.5907095 +0000 UTC m=+1197.444852542" watchObservedRunningTime="2025-10-04 05:06:11.653450616 +0000 UTC m=+1197.507593668" Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.322800 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.283498394 podStartE2EDuration="8.322778211s" podCreationTimestamp="2025-10-04 05:06:04 +0000 UTC" firstStartedPulling="2025-10-04 05:06:06.856897784 +0000 UTC m=+1192.711040826" lastFinishedPulling="2025-10-04 05:06:07.896177601 +0000 UTC m=+1193.750320643" observedRunningTime="2025-10-04 05:06:11.652135388 +0000 UTC m=+1197.506278440" watchObservedRunningTime="2025-10-04 05:06:12.322778211 +0000 UTC m=+1198.176921253" Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.325958 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-86ce-account-create-s4xn9"] Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.610401 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerStarted","Data":"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5"} Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.610830 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.614066 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86ce-account-create-s4xn9" event={"ID":"a0388204-4e2e-4a49-b47a-ef648fba57e8","Type":"ContainerStarted","Data":"89dcdeb9343cf36855d68c34da4b728dd135daa9de25423e41289ba1d794c24b"} Oct 04 05:06:12 crc kubenswrapper[4574]: 
I1004 05:06:12.614103 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86ce-account-create-s4xn9" event={"ID":"a0388204-4e2e-4a49-b47a-ef648fba57e8","Type":"ContainerStarted","Data":"f62734f6d7fc94928d6f7c5a67f6e7a3445f5acb575e67004ade38f89acb829f"} Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.617411 4574 generic.go:334] "Generic (PLEG): container finished" podID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerID="de48f94498070689a4fb364de069bc21c63e13851065e9aafa43e5ffbfe49078" exitCode=143 Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.617651 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"532cf81d-d285-432f-a77e-4b1b04f4388c","Type":"ContainerDied","Data":"de48f94498070689a4fb364de069bc21c63e13851065e9aafa43e5ffbfe49078"} Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.650865 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.337457627 podStartE2EDuration="9.650848641s" podCreationTimestamp="2025-10-04 05:06:03 +0000 UTC" firstStartedPulling="2025-10-04 05:06:05.525763039 +0000 UTC m=+1191.379906091" lastFinishedPulling="2025-10-04 05:06:11.839154063 +0000 UTC m=+1197.693297105" observedRunningTime="2025-10-04 05:06:12.650612494 +0000 UTC m=+1198.504755546" watchObservedRunningTime="2025-10-04 05:06:12.650848641 +0000 UTC m=+1198.504991683" Oct 04 05:06:12 crc kubenswrapper[4574]: I1004 05:06:12.682296 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-86ce-account-create-s4xn9" podStartSLOduration=1.6822738849999999 podStartE2EDuration="1.682273885s" podCreationTimestamp="2025-10-04 05:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:06:12.680538066 +0000 UTC m=+1198.534681128" watchObservedRunningTime="2025-10-04 05:06:12.682273885 
+0000 UTC m=+1198.536416927" Oct 04 05:06:13 crc kubenswrapper[4574]: I1004 05:06:13.629223 4574 generic.go:334] "Generic (PLEG): container finished" podID="a0388204-4e2e-4a49-b47a-ef648fba57e8" containerID="89dcdeb9343cf36855d68c34da4b728dd135daa9de25423e41289ba1d794c24b" exitCode=0 Oct 04 05:06:13 crc kubenswrapper[4574]: I1004 05:06:13.631870 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86ce-account-create-s4xn9" event={"ID":"a0388204-4e2e-4a49-b47a-ef648fba57e8","Type":"ContainerDied","Data":"89dcdeb9343cf36855d68c34da4b728dd135daa9de25423e41289ba1d794c24b"} Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.084198 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.086336 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7688fc9d67-qlxww" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.386856 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86ce-account-create-s4xn9" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.485013 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdqc7\" (UniqueName: \"kubernetes.io/projected/a0388204-4e2e-4a49-b47a-ef648fba57e8-kube-api-access-kdqc7\") pod \"a0388204-4e2e-4a49-b47a-ef648fba57e8\" (UID: \"a0388204-4e2e-4a49-b47a-ef648fba57e8\") " Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.493969 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0388204-4e2e-4a49-b47a-ef648fba57e8-kube-api-access-kdqc7" (OuterVolumeSpecName: "kube-api-access-kdqc7") pod "a0388204-4e2e-4a49-b47a-ef648fba57e8" (UID: "a0388204-4e2e-4a49-b47a-ef648fba57e8"). InnerVolumeSpecName "kube-api-access-kdqc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.572661 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-766d778598-9bz6b" podUID="c224adb6-7a04-4bd4-bc6a-d8c484c8710e" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.587786 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdqc7\" (UniqueName: \"kubernetes.io/projected/a0388204-4e2e-4a49-b47a-ef648fba57e8-kube-api-access-kdqc7\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.658903 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-86ce-account-create-s4xn9" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.659450 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-86ce-account-create-s4xn9" event={"ID":"a0388204-4e2e-4a49-b47a-ef648fba57e8","Type":"ContainerDied","Data":"f62734f6d7fc94928d6f7c5a67f6e7a3445f5acb575e67004ade38f89acb829f"} Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.659488 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f62734f6d7fc94928d6f7c5a67f6e7a3445f5acb575e67004ade38f89acb829f" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.858862 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.859855 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.171:8080/\": dial tcp 10.217.0.171:8080: connect: connection refused" 
Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.878448 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.995799 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rqc9t"] Oct 04 05:06:15 crc kubenswrapper[4574]: I1004 05:06:15.996056 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" podUID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerName="dnsmasq-dns" containerID="cri-o://73c876e0484d654a09157e18cac7986798e6b0b7d8859389c8ddb9f3bcb29186" gracePeriod=10 Oct 04 05:06:16 crc kubenswrapper[4574]: I1004 05:06:16.570424 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-766d778598-9bz6b" podUID="c224adb6-7a04-4bd4-bc6a-d8c484c8710e" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:16 crc kubenswrapper[4574]: I1004 05:06:16.571041 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-766d778598-9bz6b" podUID="c224adb6-7a04-4bd4-bc6a-d8c484c8710e" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:16 crc kubenswrapper[4574]: I1004 05:06:16.687760 4574 generic.go:334] "Generic (PLEG): container finished" podID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerID="73c876e0484d654a09157e18cac7986798e6b0b7d8859389c8ddb9f3bcb29186" exitCode=0 Oct 04 05:06:16 crc kubenswrapper[4574]: I1004 05:06:16.687871 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" 
event={"ID":"e598d7f1-865d-47bf-9263-ca027b3c92c9","Type":"ContainerDied","Data":"73c876e0484d654a09157e18cac7986798e6b0b7d8859389c8ddb9f3bcb29186"} Oct 04 05:06:16 crc kubenswrapper[4574]: I1004 05:06:16.696373 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2552db74-0d8b-4ca0-af2e-092c03e097f2","Type":"ContainerStarted","Data":"6ae8458f269d1256aac66ecdcd9b2f6d61930c5abf659c2b6e9d1da89f6459b1"} Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.161190 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.190089 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.087264345 podStartE2EDuration="38.190064157s" podCreationTimestamp="2025-10-04 05:05:39 +0000 UTC" firstStartedPulling="2025-10-04 05:05:41.057120708 +0000 UTC m=+1166.911263750" lastFinishedPulling="2025-10-04 05:06:16.15992052 +0000 UTC m=+1202.014063562" observedRunningTime="2025-10-04 05:06:16.724143983 +0000 UTC m=+1202.578287035" watchObservedRunningTime="2025-10-04 05:06:17.190064157 +0000 UTC m=+1203.044207199" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.229163 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-swift-storage-0\") pod \"e598d7f1-865d-47bf-9263-ca027b3c92c9\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.229334 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-sb\") pod \"e598d7f1-865d-47bf-9263-ca027b3c92c9\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " Oct 04 05:06:17 crc 
kubenswrapper[4574]: I1004 05:06:17.229556 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-svc\") pod \"e598d7f1-865d-47bf-9263-ca027b3c92c9\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.229620 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-nb\") pod \"e598d7f1-865d-47bf-9263-ca027b3c92c9\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.229679 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwsnv\" (UniqueName: \"kubernetes.io/projected/e598d7f1-865d-47bf-9263-ca027b3c92c9-kube-api-access-hwsnv\") pod \"e598d7f1-865d-47bf-9263-ca027b3c92c9\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.229715 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-config\") pod \"e598d7f1-865d-47bf-9263-ca027b3c92c9\" (UID: \"e598d7f1-865d-47bf-9263-ca027b3c92c9\") " Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.254212 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e598d7f1-865d-47bf-9263-ca027b3c92c9-kube-api-access-hwsnv" (OuterVolumeSpecName: "kube-api-access-hwsnv") pod "e598d7f1-865d-47bf-9263-ca027b3c92c9" (UID: "e598d7f1-865d-47bf-9263-ca027b3c92c9"). InnerVolumeSpecName "kube-api-access-hwsnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.330988 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwsnv\" (UniqueName: \"kubernetes.io/projected/e598d7f1-865d-47bf-9263-ca027b3c92c9-kube-api-access-hwsnv\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.408248 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e598d7f1-865d-47bf-9263-ca027b3c92c9" (UID: "e598d7f1-865d-47bf-9263-ca027b3c92c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.432745 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.440803 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e598d7f1-865d-47bf-9263-ca027b3c92c9" (UID: "e598d7f1-865d-47bf-9263-ca027b3c92c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.460680 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-config" (OuterVolumeSpecName: "config") pod "e598d7f1-865d-47bf-9263-ca027b3c92c9" (UID: "e598d7f1-865d-47bf-9263-ca027b3c92c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.493154 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e598d7f1-865d-47bf-9263-ca027b3c92c9" (UID: "e598d7f1-865d-47bf-9263-ca027b3c92c9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.500661 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e598d7f1-865d-47bf-9263-ca027b3c92c9" (UID: "e598d7f1-865d-47bf-9263-ca027b3c92c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.534066 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.534102 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.534119 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.534132 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e598d7f1-865d-47bf-9263-ca027b3c92c9-ovsdbserver-nb\") on node \"crc\" DevicePath 
\"\"" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.706848 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" event={"ID":"e598d7f1-865d-47bf-9263-ca027b3c92c9","Type":"ContainerDied","Data":"d724b7374a7b2bc0a76bb605c1a53317c993cf374bec222e7f0d8c61d16d4a97"} Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.706923 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rqc9t" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.706937 4574 scope.go:117] "RemoveContainer" containerID="73c876e0484d654a09157e18cac7986798e6b0b7d8859389c8ddb9f3bcb29186" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.744768 4574 scope.go:117] "RemoveContainer" containerID="91613fe8fee7919f3cfff79db0d7ce3e20fc9185cede4edd77893dceaf30cec0" Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.759386 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rqc9t"] Oct 04 05:06:17 crc kubenswrapper[4574]: I1004 05:06:17.775590 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rqc9t"] Oct 04 05:06:18 crc kubenswrapper[4574]: I1004 05:06:18.762924 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e598d7f1-865d-47bf-9263-ca027b3c92c9" path="/var/lib/kubelet/pods/e598d7f1-865d-47bf-9263-ca027b3c92c9/volumes" Oct 04 05:06:19 crc kubenswrapper[4574]: I1004 05:06:19.711696 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:06:19 crc kubenswrapper[4574]: I1004 05:06:19.842269 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" 
podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 04 05:06:20 crc kubenswrapper[4574]: I1004 05:06:20.567531 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-766d778598-9bz6b" podUID="c224adb6-7a04-4bd4-bc6a-d8c484c8710e" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:20 crc kubenswrapper[4574]: I1004 05:06:20.577500 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-766d778598-9bz6b" podUID="c224adb6-7a04-4bd4-bc6a-d8c484c8710e" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:20 crc kubenswrapper[4574]: I1004 05:06:20.672504 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.202969 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b24d-account-create-4x5bk"] Oct 04 05:06:21 crc kubenswrapper[4574]: E1004 05:06:21.204132 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0388204-4e2e-4a49-b47a-ef648fba57e8" containerName="mariadb-account-create" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.204178 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0388204-4e2e-4a49-b47a-ef648fba57e8" containerName="mariadb-account-create" Oct 04 05:06:21 crc kubenswrapper[4574]: E1004 05:06:21.204204 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerName="init" Oct 04 05:06:21 
crc kubenswrapper[4574]: I1004 05:06:21.204216 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerName="init" Oct 04 05:06:21 crc kubenswrapper[4574]: E1004 05:06:21.204250 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerName="dnsmasq-dns" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.204261 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerName="dnsmasq-dns" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.204552 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0388204-4e2e-4a49-b47a-ef648fba57e8" containerName="mariadb-account-create" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.204574 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e598d7f1-865d-47bf-9263-ca027b3c92c9" containerName="dnsmasq-dns" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.205616 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b24d-account-create-4x5bk" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.207159 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.222400 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b24d-account-create-4x5bk"] Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.231388 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.173:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.313291 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7129-account-create-v7qwd"] Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.314536 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7129-account-create-v7qwd" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.316442 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.321280 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zwgs\" (UniqueName: \"kubernetes.io/projected/e16c650e-b479-4611-b9cf-2085522591c6-kube-api-access-5zwgs\") pod \"nova-cell0-b24d-account-create-4x5bk\" (UID: \"e16c650e-b479-4611-b9cf-2085522591c6\") " pod="openstack/nova-cell0-b24d-account-create-4x5bk" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.324793 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7129-account-create-v7qwd"] Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.424014 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb8wb\" (UniqueName: \"kubernetes.io/projected/270e012c-9f36-48e8-8485-b38005557964-kube-api-access-lb8wb\") pod \"nova-cell1-7129-account-create-v7qwd\" (UID: \"270e012c-9f36-48e8-8485-b38005557964\") " pod="openstack/nova-cell1-7129-account-create-v7qwd" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.424541 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zwgs\" (UniqueName: \"kubernetes.io/projected/e16c650e-b479-4611-b9cf-2085522591c6-kube-api-access-5zwgs\") pod \"nova-cell0-b24d-account-create-4x5bk\" (UID: \"e16c650e-b479-4611-b9cf-2085522591c6\") " pod="openstack/nova-cell0-b24d-account-create-4x5bk" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.464708 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zwgs\" (UniqueName: \"kubernetes.io/projected/e16c650e-b479-4611-b9cf-2085522591c6-kube-api-access-5zwgs\") pod 
\"nova-cell0-b24d-account-create-4x5bk\" (UID: \"e16c650e-b479-4611-b9cf-2085522591c6\") " pod="openstack/nova-cell0-b24d-account-create-4x5bk" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.525990 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b24d-account-create-4x5bk" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.528741 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb8wb\" (UniqueName: \"kubernetes.io/projected/270e012c-9f36-48e8-8485-b38005557964-kube-api-access-lb8wb\") pod \"nova-cell1-7129-account-create-v7qwd\" (UID: \"270e012c-9f36-48e8-8485-b38005557964\") " pod="openstack/nova-cell1-7129-account-create-v7qwd" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.546339 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb8wb\" (UniqueName: \"kubernetes.io/projected/270e012c-9f36-48e8-8485-b38005557964-kube-api-access-lb8wb\") pod \"nova-cell1-7129-account-create-v7qwd\" (UID: \"270e012c-9f36-48e8-8485-b38005557964\") " pod="openstack/nova-cell1-7129-account-create-v7qwd" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.589365 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-766d778598-9bz6b" podUID="c224adb6-7a04-4bd4-bc6a-d8c484c8710e" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:21 crc kubenswrapper[4574]: I1004 05:06:21.589763 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-766d778598-9bz6b" podUID="c224adb6-7a04-4bd4-bc6a-d8c484c8710e" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 05:06:21 crc kubenswrapper[4574]: 
I1004 05:06:21.650934 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7129-account-create-v7qwd" Oct 04 05:06:22 crc kubenswrapper[4574]: I1004 05:06:22.347736 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b24d-account-create-4x5bk"] Oct 04 05:06:22 crc kubenswrapper[4574]: I1004 05:06:22.540901 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7129-account-create-v7qwd"] Oct 04 05:06:22 crc kubenswrapper[4574]: I1004 05:06:22.800115 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 04 05:06:22 crc kubenswrapper[4574]: I1004 05:06:22.889402 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b24d-account-create-4x5bk" event={"ID":"e16c650e-b479-4611-b9cf-2085522591c6","Type":"ContainerStarted","Data":"4f07ac6467880fa375073ad52f838def7a6b10d5078b3482d07c5caa89f157f5"} Oct 04 05:06:22 crc kubenswrapper[4574]: I1004 05:06:22.906378 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7129-account-create-v7qwd" event={"ID":"270e012c-9f36-48e8-8485-b38005557964","Type":"ContainerStarted","Data":"4debfbd3e28a44436e2bee3d2cb84309c99b9d82c4b5bc5d68570f322a120f96"} Oct 04 05:06:23 crc kubenswrapper[4574]: I1004 05:06:23.916635 4574 generic.go:334] "Generic (PLEG): container finished" podID="e16c650e-b479-4611-b9cf-2085522591c6" containerID="215010902fd4fbb967a64b74964562eae1dafab00d2d21ac10e6a2f4f4096bcf" exitCode=0 Oct 04 05:06:23 crc kubenswrapper[4574]: I1004 05:06:23.916798 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b24d-account-create-4x5bk" 
event={"ID":"e16c650e-b479-4611-b9cf-2085522591c6","Type":"ContainerDied","Data":"215010902fd4fbb967a64b74964562eae1dafab00d2d21ac10e6a2f4f4096bcf"} Oct 04 05:06:23 crc kubenswrapper[4574]: I1004 05:06:23.918138 4574 generic.go:334] "Generic (PLEG): container finished" podID="270e012c-9f36-48e8-8485-b38005557964" containerID="8320ead7ee0e60c7cc47942a2b77889a400c7dad2bc1c673bd5b8cfd96dc073d" exitCode=0 Oct 04 05:06:23 crc kubenswrapper[4574]: I1004 05:06:23.918163 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7129-account-create-v7qwd" event={"ID":"270e012c-9f36-48e8-8485-b38005557964","Type":"ContainerDied","Data":"8320ead7ee0e60c7cc47942a2b77889a400c7dad2bc1c673bd5b8cfd96dc073d"} Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.417756 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-766d778598-9bz6b" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.612597 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-848d9d6b7d-xxqvb"] Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.612900 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-848d9d6b7d-xxqvb" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api-log" containerID="cri-o://2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37" gracePeriod=30 Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.613131 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-848d9d6b7d-xxqvb" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api" containerID="cri-o://431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858" gracePeriod=30 Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.621367 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b24d-account-create-4x5bk" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.634110 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7129-account-create-v7qwd" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.698026 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb8wb\" (UniqueName: \"kubernetes.io/projected/270e012c-9f36-48e8-8485-b38005557964-kube-api-access-lb8wb\") pod \"270e012c-9f36-48e8-8485-b38005557964\" (UID: \"270e012c-9f36-48e8-8485-b38005557964\") " Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.704370 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zwgs\" (UniqueName: \"kubernetes.io/projected/e16c650e-b479-4611-b9cf-2085522591c6-kube-api-access-5zwgs\") pod \"e16c650e-b479-4611-b9cf-2085522591c6\" (UID: \"e16c650e-b479-4611-b9cf-2085522591c6\") " Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.750473 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16c650e-b479-4611-b9cf-2085522591c6-kube-api-access-5zwgs" (OuterVolumeSpecName: "kube-api-access-5zwgs") pod "e16c650e-b479-4611-b9cf-2085522591c6" (UID: "e16c650e-b479-4611-b9cf-2085522591c6"). InnerVolumeSpecName "kube-api-access-5zwgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.757587 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270e012c-9f36-48e8-8485-b38005557964-kube-api-access-lb8wb" (OuterVolumeSpecName: "kube-api-access-lb8wb") pod "270e012c-9f36-48e8-8485-b38005557964" (UID: "270e012c-9f36-48e8-8485-b38005557964"). InnerVolumeSpecName "kube-api-access-lb8wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.810766 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb8wb\" (UniqueName: \"kubernetes.io/projected/270e012c-9f36-48e8-8485-b38005557964-kube-api-access-lb8wb\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.810802 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zwgs\" (UniqueName: \"kubernetes.io/projected/e16c650e-b479-4611-b9cf-2085522591c6-kube-api-access-5zwgs\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.864553 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.956538 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.985657 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b24d-account-create-4x5bk" event={"ID":"e16c650e-b479-4611-b9cf-2085522591c6","Type":"ContainerDied","Data":"4f07ac6467880fa375073ad52f838def7a6b10d5078b3482d07c5caa89f157f5"} Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.985703 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f07ac6467880fa375073ad52f838def7a6b10d5078b3482d07c5caa89f157f5" Oct 04 05:06:25 crc kubenswrapper[4574]: I1004 05:06:25.985768 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b24d-account-create-4x5bk" Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.031368 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7129-account-create-v7qwd" Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.031410 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7129-account-create-v7qwd" event={"ID":"270e012c-9f36-48e8-8485-b38005557964","Type":"ContainerDied","Data":"4debfbd3e28a44436e2bee3d2cb84309c99b9d82c4b5bc5d68570f322a120f96"} Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.031438 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4debfbd3e28a44436e2bee3d2cb84309c99b9d82c4b5bc5d68570f322a120f96" Oct 04 05:06:26 crc kubenswrapper[4574]: E1004 05:06:26.035910 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871e935e_c9ec_4798_a014_6a9fe6cd1fbd.slice/crio-2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871e935e_c9ec_4798_a014_6a9fe6cd1fbd.slice/crio-conmon-2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.065756 4574 generic.go:334] "Generic (PLEG): container finished" podID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerID="2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37" exitCode=143 Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.065969 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="cinder-scheduler" containerID="cri-o://d0e7b99d7a00461a72b61d28a9ac2d9573a5ae27a9cd9abd5ab9c3411bcc8126" gracePeriod=30 Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.066958 4574 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-api-848d9d6b7d-xxqvb" event={"ID":"871e935e-c9ec-4798-a014-6a9fe6cd1fbd","Type":"ContainerDied","Data":"2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37"} Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.067222 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="probe" containerID="cri-o://14e3cf87af38427fbca34d9160e948e91e0ea7040d8e15302a99168afb46a53c" gracePeriod=30 Oct 04 05:06:26 crc kubenswrapper[4574]: I1004 05:06:26.168937 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 04 05:06:27 crc kubenswrapper[4574]: I1004 05:06:27.077438 4574 generic.go:334] "Generic (PLEG): container finished" podID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerID="14e3cf87af38427fbca34d9160e948e91e0ea7040d8e15302a99168afb46a53c" exitCode=0 Oct 04 05:06:27 crc kubenswrapper[4574]: I1004 05:06:27.077505 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d552b9ab-5c77-4524-b268-ea27f9a661c4","Type":"ContainerDied","Data":"14e3cf87af38427fbca34d9160e948e91e0ea7040d8e15302a99168afb46a53c"} Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.354297 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-848d9d6b7d-xxqvb" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:50162->10.217.0.164:9311: read: connection reset by peer" Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.354858 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-848d9d6b7d-xxqvb" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 
10.217.0.2:50166->10.217.0.164:9311: read: connection reset by peer" Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.379762 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.380104 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-central-agent" containerID="cri-o://c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384" gracePeriod=30 Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.380539 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="proxy-httpd" containerID="cri-o://8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5" gracePeriod=30 Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.380595 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="sg-core" containerID="cri-o://5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b" gracePeriod=30 Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.380700 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-notification-agent" containerID="cri-o://3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79" gracePeriod=30 Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.726367 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection 
refused" Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.815382 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.170:3000/\": read tcp 10.217.0.2:42834->10.217.0.170:3000: read: connection reset by peer" Oct 04 05:06:29 crc kubenswrapper[4574]: I1004 05:06:29.861440 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.090826 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.107802 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-logs\") pod \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.108007 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data-custom\") pod \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.108090 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data\") pod \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " Oct 
04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.108117 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-combined-ca-bundle\") pod \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.108188 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlccw\" (UniqueName: \"kubernetes.io/projected/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-kube-api-access-hlccw\") pod \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\" (UID: \"871e935e-c9ec-4798-a014-6a9fe6cd1fbd\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.108878 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-logs" (OuterVolumeSpecName: "logs") pod "871e935e-c9ec-4798-a014-6a9fe6cd1fbd" (UID: "871e935e-c9ec-4798-a014-6a9fe6cd1fbd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.123940 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "871e935e-c9ec-4798-a014-6a9fe6cd1fbd" (UID: "871e935e-c9ec-4798-a014-6a9fe6cd1fbd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.130510 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-kube-api-access-hlccw" (OuterVolumeSpecName: "kube-api-access-hlccw") pod "871e935e-c9ec-4798-a014-6a9fe6cd1fbd" (UID: "871e935e-c9ec-4798-a014-6a9fe6cd1fbd"). InnerVolumeSpecName "kube-api-access-hlccw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.217246 4574 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.217581 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlccw\" (UniqueName: \"kubernetes.io/projected/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-kube-api-access-hlccw\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.217646 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.233451 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "871e935e-c9ec-4798-a014-6a9fe6cd1fbd" (UID: "871e935e-c9ec-4798-a014-6a9fe6cd1fbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.248886 4574 generic.go:334] "Generic (PLEG): container finished" podID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerID="431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858" exitCode=0 Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.249072 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-848d9d6b7d-xxqvb" event={"ID":"871e935e-c9ec-4798-a014-6a9fe6cd1fbd","Type":"ContainerDied","Data":"431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858"} Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.249106 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-848d9d6b7d-xxqvb" event={"ID":"871e935e-c9ec-4798-a014-6a9fe6cd1fbd","Type":"ContainerDied","Data":"fdfb7bc705ef5afb36cc5dcab327f1c790613d249173d8fc5e174a40b687b12b"} Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.249126 4574 scope.go:117] "RemoveContainer" containerID="431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.249367 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-848d9d6b7d-xxqvb" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.274180 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data" (OuterVolumeSpecName: "config-data") pod "871e935e-c9ec-4798-a014-6a9fe6cd1fbd" (UID: "871e935e-c9ec-4798-a014-6a9fe6cd1fbd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.328115 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.328144 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/871e935e-c9ec-4798-a014-6a9fe6cd1fbd-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.366988 4574 generic.go:334] "Generic (PLEG): container finished" podID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerID="d0e7b99d7a00461a72b61d28a9ac2d9573a5ae27a9cd9abd5ab9c3411bcc8126" exitCode=0 Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.367959 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d552b9ab-5c77-4524-b268-ea27f9a661c4","Type":"ContainerDied","Data":"d0e7b99d7a00461a72b61d28a9ac2d9573a5ae27a9cd9abd5ab9c3411bcc8126"} Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.384139 4574 generic.go:334] "Generic (PLEG): container finished" podID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerID="5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b" exitCode=2 Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.384186 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerDied","Data":"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b"} Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.430577 4574 scope.go:117] "RemoveContainer" containerID="2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.559617 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.595635 4574 scope.go:117] "RemoveContainer" containerID="431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858" Oct 04 05:06:30 crc kubenswrapper[4574]: E1004 05:06:30.602043 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858\": container with ID starting with 431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858 not found: ID does not exist" containerID="431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.602102 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858"} err="failed to get container status \"431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858\": rpc error: code = NotFound desc = could not find container \"431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858\": container with ID starting with 431ac9fc2d3bb08c68f81526e7f75910e69cc4be7453e26aedd1e487b1d38858 not found: ID does not exist" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.602131 4574 scope.go:117] "RemoveContainer" containerID="2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37" Oct 04 05:06:30 crc kubenswrapper[4574]: E1004 05:06:30.602495 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37\": container with ID starting with 2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37 not found: ID does not exist" containerID="2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 
05:06:30.602535 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37"} err="failed to get container status \"2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37\": rpc error: code = NotFound desc = could not find container \"2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37\": container with ID starting with 2cbbed83b31e6407a6929696f3f722843090adbd6bd642dabc9b109f63f0ab37 not found: ID does not exist" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.684265 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-848d9d6b7d-xxqvb"] Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.703939 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-848d9d6b7d-xxqvb"] Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.740622 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/d552b9ab-5c77-4524-b268-ea27f9a661c4-kube-api-access-5j7f6\") pod \"d552b9ab-5c77-4524-b268-ea27f9a661c4\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.740687 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d552b9ab-5c77-4524-b268-ea27f9a661c4-etc-machine-id\") pod \"d552b9ab-5c77-4524-b268-ea27f9a661c4\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.740798 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data\") pod \"d552b9ab-5c77-4524-b268-ea27f9a661c4\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 
05:06:30.740844 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-scripts\") pod \"d552b9ab-5c77-4524-b268-ea27f9a661c4\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.741011 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data-custom\") pod \"d552b9ab-5c77-4524-b268-ea27f9a661c4\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.741035 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-combined-ca-bundle\") pod \"d552b9ab-5c77-4524-b268-ea27f9a661c4\" (UID: \"d552b9ab-5c77-4524-b268-ea27f9a661c4\") " Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.741426 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d552b9ab-5c77-4524-b268-ea27f9a661c4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d552b9ab-5c77-4524-b268-ea27f9a661c4" (UID: "d552b9ab-5c77-4524-b268-ea27f9a661c4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.741839 4574 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d552b9ab-5c77-4524-b268-ea27f9a661c4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.752961 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d552b9ab-5c77-4524-b268-ea27f9a661c4" (UID: "d552b9ab-5c77-4524-b268-ea27f9a661c4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.753549 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" path="/var/lib/kubelet/pods/871e935e-c9ec-4798-a014-6a9fe6cd1fbd/volumes" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.762141 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d552b9ab-5c77-4524-b268-ea27f9a661c4-kube-api-access-5j7f6" (OuterVolumeSpecName: "kube-api-access-5j7f6") pod "d552b9ab-5c77-4524-b268-ea27f9a661c4" (UID: "d552b9ab-5c77-4524-b268-ea27f9a661c4"). InnerVolumeSpecName "kube-api-access-5j7f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.764135 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-scripts" (OuterVolumeSpecName: "scripts") pod "d552b9ab-5c77-4524-b268-ea27f9a661c4" (UID: "d552b9ab-5c77-4524-b268-ea27f9a661c4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.823170 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d552b9ab-5c77-4524-b268-ea27f9a661c4" (UID: "d552b9ab-5c77-4524-b268-ea27f9a661c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.846619 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.846645 4574 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.846657 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.846668 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j7f6\" (UniqueName: \"kubernetes.io/projected/d552b9ab-5c77-4524-b268-ea27f9a661c4-kube-api-access-5j7f6\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.921618 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data" (OuterVolumeSpecName: "config-data") pod "d552b9ab-5c77-4524-b268-ea27f9a661c4" (UID: "d552b9ab-5c77-4524-b268-ea27f9a661c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:30 crc kubenswrapper[4574]: I1004 05:06:30.947332 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d552b9ab-5c77-4524-b268-ea27f9a661c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.085588 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.252215 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-combined-ca-bundle\") pod \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.252314 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-scripts\") pod \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.252364 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-sg-core-conf-yaml\") pod \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.252598 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-log-httpd\") pod \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.253127 4574 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8cb7137b-948e-40ee-9424-267fdeb9e1c0" (UID: "8cb7137b-948e-40ee-9424-267fdeb9e1c0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.253187 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74wqz\" (UniqueName: \"kubernetes.io/projected/8cb7137b-948e-40ee-9424-267fdeb9e1c0-kube-api-access-74wqz\") pod \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.253628 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-config-data\") pod \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.253691 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-run-httpd\") pod \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\" (UID: \"8cb7137b-948e-40ee-9424-267fdeb9e1c0\") " Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.254215 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.254566 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8cb7137b-948e-40ee-9424-267fdeb9e1c0" (UID: "8cb7137b-948e-40ee-9424-267fdeb9e1c0"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.266507 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb7137b-948e-40ee-9424-267fdeb9e1c0-kube-api-access-74wqz" (OuterVolumeSpecName: "kube-api-access-74wqz") pod "8cb7137b-948e-40ee-9424-267fdeb9e1c0" (UID: "8cb7137b-948e-40ee-9424-267fdeb9e1c0"). InnerVolumeSpecName "kube-api-access-74wqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.268580 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-scripts" (OuterVolumeSpecName: "scripts") pod "8cb7137b-948e-40ee-9424-267fdeb9e1c0" (UID: "8cb7137b-948e-40ee-9424-267fdeb9e1c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.339511 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8cb7137b-948e-40ee-9424-267fdeb9e1c0" (UID: "8cb7137b-948e-40ee-9424-267fdeb9e1c0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.355573 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74wqz\" (UniqueName: \"kubernetes.io/projected/8cb7137b-948e-40ee-9424-267fdeb9e1c0-kube-api-access-74wqz\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.355620 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb7137b-948e-40ee-9424-267fdeb9e1c0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.355636 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.355648 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.401296 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d552b9ab-5c77-4524-b268-ea27f9a661c4","Type":"ContainerDied","Data":"cc9d69b8cc202a502eaf0a4602926de8f6a412f4ad97a4c61b47992de5c97088"} Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.401348 4574 scope.go:117] "RemoveContainer" containerID="14e3cf87af38427fbca34d9160e948e91e0ea7040d8e15302a99168afb46a53c" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.401450 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437285 4574 generic.go:334] "Generic (PLEG): container finished" podID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerID="8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5" exitCode=0 Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437320 4574 generic.go:334] "Generic (PLEG): container finished" podID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerID="3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79" exitCode=0 Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437327 4574 generic.go:334] "Generic (PLEG): container finished" podID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerID="c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384" exitCode=0 Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437376 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerDied","Data":"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5"} Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437404 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerDied","Data":"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79"} Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437415 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerDied","Data":"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384"} Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437427 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8cb7137b-948e-40ee-9424-267fdeb9e1c0","Type":"ContainerDied","Data":"b15ae781cf81c396c20707d63ed388bb2682c5437fba92e132db24511c2e539d"} Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.437518 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.451708 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cb7137b-948e-40ee-9424-267fdeb9e1c0" (UID: "8cb7137b-948e-40ee-9424-267fdeb9e1c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.462491 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.467662 4574 scope.go:117] "RemoveContainer" containerID="d0e7b99d7a00461a72b61d28a9ac2d9573a5ae27a9cd9abd5ab9c3411bcc8126" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.493247 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-config-data" (OuterVolumeSpecName: "config-data") pod "8cb7137b-948e-40ee-9424-267fdeb9e1c0" (UID: "8cb7137b-948e-40ee-9424-267fdeb9e1c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.502364 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.503205 4574 scope.go:117] "RemoveContainer" containerID="8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.510338 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530182 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530603 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270e012c-9f36-48e8-8485-b38005557964" containerName="mariadb-account-create" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530622 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="270e012c-9f36-48e8-8485-b38005557964" containerName="mariadb-account-create" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530641 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="cinder-scheduler" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530647 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="cinder-scheduler" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530661 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api-log" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530675 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api-log" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530692 4574 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-central-agent" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530700 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-central-agent" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530718 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="sg-core" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530724 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="sg-core" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530746 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="proxy-httpd" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530753 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="proxy-httpd" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530769 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16c650e-b479-4611-b9cf-2085522591c6" containerName="mariadb-account-create" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530774 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16c650e-b479-4611-b9cf-2085522591c6" containerName="mariadb-account-create" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530787 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-notification-agent" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530793 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-notification-agent" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530804 
4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="probe" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530809 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="probe" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.530820 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.530827 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531004 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16c650e-b479-4611-b9cf-2085522591c6" containerName="mariadb-account-create" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531025 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="probe" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531040 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531051 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-notification-agent" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531060 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="270e012c-9f36-48e8-8485-b38005557964" containerName="mariadb-account-create" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531073 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="sg-core" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531083 4574 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="proxy-httpd" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531094 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="871e935e-c9ec-4798-a014-6a9fe6cd1fbd" containerName="barbican-api-log" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531103 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" containerName="ceilometer-central-agent" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.531124 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" containerName="cinder-scheduler" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.532112 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.536154 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.559668 4574 scope.go:117] "RemoveContainer" containerID="5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.565590 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.565732 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.565818 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/161f98e1-5520-4148-8565-05394e7e8daf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.565908 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-config-data\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.565944 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fc4c\" (UniqueName: \"kubernetes.io/projected/161f98e1-5520-4148-8565-05394e7e8daf-kube-api-access-8fc4c\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.565975 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.566209 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb7137b-948e-40ee-9424-267fdeb9e1c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.585718 4574 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.624018 4574 scope.go:117] "RemoveContainer" containerID="3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.659892 4574 scope.go:117] "RemoveContainer" containerID="c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.668917 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/161f98e1-5520-4148-8565-05394e7e8daf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.669050 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-config-data\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.669084 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fc4c\" (UniqueName: \"kubernetes.io/projected/161f98e1-5520-4148-8565-05394e7e8daf-kube-api-access-8fc4c\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.669121 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.669184 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.669312 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-scripts\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.669906 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/161f98e1-5520-4148-8565-05394e7e8daf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.679132 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-scripts\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.680656 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-config-data\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.682495 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.686708 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rzg59"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.688124 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.694584 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161f98e1-5520-4148-8565-05394e7e8daf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.695345 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.695896 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bl8s2" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.696432 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.700696 4574 scope.go:117] "RemoveContainer" containerID="8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.701547 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5\": container with ID starting with 8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5 not found: ID does not exist" containerID="8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5" Oct 04 05:06:31 crc 
kubenswrapper[4574]: I1004 05:06:31.701596 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5"} err="failed to get container status \"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5\": rpc error: code = NotFound desc = could not find container \"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5\": container with ID starting with 8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.701624 4574 scope.go:117] "RemoveContainer" containerID="5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.707094 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b\": container with ID starting with 5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b not found: ID does not exist" containerID="5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.707131 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b"} err="failed to get container status \"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b\": rpc error: code = NotFound desc = could not find container \"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b\": container with ID starting with 5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.707156 4574 scope.go:117] "RemoveContainer" containerID="3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79" Oct 04 
05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.717018 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rzg59"] Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.721062 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79\": container with ID starting with 3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79 not found: ID does not exist" containerID="3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.721109 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79"} err="failed to get container status \"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79\": rpc error: code = NotFound desc = could not find container \"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79\": container with ID starting with 3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.721144 4574 scope.go:117] "RemoveContainer" containerID="c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.722749 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fc4c\" (UniqueName: \"kubernetes.io/projected/161f98e1-5520-4148-8565-05394e7e8daf-kube-api-access-8fc4c\") pod \"cinder-scheduler-0\" (UID: \"161f98e1-5520-4148-8565-05394e7e8daf\") " pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: E1004 05:06:31.736558 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384\": container with ID starting with c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384 not found: ID does not exist" containerID="c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.736645 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384"} err="failed to get container status \"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384\": rpc error: code = NotFound desc = could not find container \"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384\": container with ID starting with c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.736685 4574 scope.go:117] "RemoveContainer" containerID="8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.737157 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5"} err="failed to get container status \"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5\": rpc error: code = NotFound desc = could not find container \"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5\": container with ID starting with 8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.737185 4574 scope.go:117] "RemoveContainer" containerID="5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.739566 4574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b"} err="failed to get container status \"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b\": rpc error: code = NotFound desc = could not find container \"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b\": container with ID starting with 5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.739623 4574 scope.go:117] "RemoveContainer" containerID="3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.740556 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79"} err="failed to get container status \"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79\": rpc error: code = NotFound desc = could not find container \"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79\": container with ID starting with 3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.740632 4574 scope.go:117] "RemoveContainer" containerID="c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.743859 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384"} err="failed to get container status \"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384\": rpc error: code = NotFound desc = could not find container \"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384\": container with ID starting with c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384 not found: ID does not 
exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.743901 4574 scope.go:117] "RemoveContainer" containerID="8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.744922 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5"} err="failed to get container status \"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5\": rpc error: code = NotFound desc = could not find container \"8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5\": container with ID starting with 8df7c4cf46ceebfd978c407607f4ea0349dcd8830d24b42a32f7f53a13ea3db5 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.744992 4574 scope.go:117] "RemoveContainer" containerID="5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.745331 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b"} err="failed to get container status \"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b\": rpc error: code = NotFound desc = could not find container \"5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b\": container with ID starting with 5e1773c95dcc40af87675c9b19bc0c75d981041856a6ed10afcb5bf85d93794b not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.745385 4574 scope.go:117] "RemoveContainer" containerID="3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.746399 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79"} err="failed to get container status 
\"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79\": rpc error: code = NotFound desc = could not find container \"3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79\": container with ID starting with 3dc514df503e962b0b71f81436d4a0e94329aa344076be4a6d079fc398dbff79 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.746458 4574 scope.go:117] "RemoveContainer" containerID="c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.746963 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384"} err="failed to get container status \"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384\": rpc error: code = NotFound desc = could not find container \"c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384\": container with ID starting with c9e8848ef89146ac8bf3ebf38aa39ec9da0f1aa082215c644da99f6cd9945384 not found: ID does not exist" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.770676 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkd27\" (UniqueName: \"kubernetes.io/projected/232b9769-2677-4ce8-991e-a8b94b2e5de1-kube-api-access-qkd27\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.770739 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-scripts\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.770894 
4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-config-data\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.770925 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.826346 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.860423 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.873634 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.875526 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkd27\" (UniqueName: \"kubernetes.io/projected/232b9769-2677-4ce8-991e-a8b94b2e5de1-kube-api-access-qkd27\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.875692 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-scripts\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.875848 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-config-data\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.875940 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.887054 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-config-data\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.888065 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.888354 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-scripts\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.902821 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkd27\" (UniqueName: \"kubernetes.io/projected/232b9769-2677-4ce8-991e-a8b94b2e5de1-kube-api-access-qkd27\") pod \"nova-cell0-conductor-db-sync-rzg59\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.911602 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.915696 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.919772 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.930117 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:31 crc kubenswrapper[4574]: I1004 05:06:31.935775 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.047611 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.088748 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-config-data\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.088889 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.088998 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd2q5\" (UniqueName: \"kubernetes.io/projected/651b5d55-242f-4512-8ffc-d59a70f71660-kube-api-access-rd2q5\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.089053 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-run-httpd\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.089147 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-scripts\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.089322 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-log-httpd\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.089353 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.191652 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-log-httpd\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.191696 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.191722 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-config-data\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.191781 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.191817 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd2q5\" (UniqueName: \"kubernetes.io/projected/651b5d55-242f-4512-8ffc-d59a70f71660-kube-api-access-rd2q5\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.191855 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-run-httpd\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.191897 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-scripts\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.193007 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-log-httpd\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.193566 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-run-httpd\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.205224 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.211473 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.219474 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-config-data\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.233781 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd2q5\" (UniqueName: \"kubernetes.io/projected/651b5d55-242f-4512-8ffc-d59a70f71660-kube-api-access-rd2q5\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 
05:06:32.234083 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-scripts\") pod \"ceilometer-0\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.261490 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.562667 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.780765 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb7137b-948e-40ee-9424-267fdeb9e1c0" path="/var/lib/kubelet/pods/8cb7137b-948e-40ee-9424-267fdeb9e1c0/volumes" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.782009 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d552b9ab-5c77-4524-b268-ea27f9a661c4" path="/var/lib/kubelet/pods/d552b9ab-5c77-4524-b268-ea27f9a661c4/volumes" Oct 04 05:06:32 crc kubenswrapper[4574]: I1004 05:06:32.976199 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rzg59"] Oct 04 05:06:33 crc kubenswrapper[4574]: I1004 05:06:33.033733 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:33 crc kubenswrapper[4574]: I1004 05:06:33.495438 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerStarted","Data":"f92f98b2d6776416a31493ab634e693ecd4a6c8babfc5e0da7855d38baf2f2b2"} Oct 04 05:06:33 crc kubenswrapper[4574]: I1004 05:06:33.501599 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"161f98e1-5520-4148-8565-05394e7e8daf","Type":"ContainerStarted","Data":"bec8431d9cbcbc23801bdc6d027c7622df35dfba57982ab329de71e30ddb70ee"} Oct 04 05:06:33 crc kubenswrapper[4574]: I1004 05:06:33.503323 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rzg59" event={"ID":"232b9769-2677-4ce8-991e-a8b94b2e5de1","Type":"ContainerStarted","Data":"90903947d96184f3f9b7a0ca3b01e030387d5fdc0946ab4f714bf5a009083e3b"} Oct 04 05:06:34 crc kubenswrapper[4574]: I1004 05:06:34.560179 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerStarted","Data":"a3d3db666eba41cd9804ed72ca012b2104dcf83e9a2305ea39d01f7f95ab1714"} Oct 04 05:06:34 crc kubenswrapper[4574]: I1004 05:06:34.622294 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"161f98e1-5520-4148-8565-05394e7e8daf","Type":"ContainerStarted","Data":"2aec892ee62b2fe79c64e574e8325cdbff7c5945e425511f1dc0a34735ff9c0d"} Oct 04 05:06:35 crc kubenswrapper[4574]: I1004 05:06:35.711083 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerStarted","Data":"926af762872a85d7a56acefa6ec843c706552f126604b0d1a7fbdc6d4fea983a"} Oct 04 05:06:35 crc kubenswrapper[4574]: I1004 05:06:35.727509 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"161f98e1-5520-4148-8565-05394e7e8daf","Type":"ContainerStarted","Data":"b039cd24763fd4e1f18d041b1cc1e5a5da385373e219d1d7dc1d9d3ab39e057f"} Oct 04 05:06:35 crc kubenswrapper[4574]: I1004 05:06:35.763816 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.763796936 podStartE2EDuration="4.763796936s" podCreationTimestamp="2025-10-04 05:06:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:06:35.749294993 +0000 UTC m=+1221.603438035" watchObservedRunningTime="2025-10-04 05:06:35.763796936 +0000 UTC m=+1221.617939978" Oct 04 05:06:36 crc kubenswrapper[4574]: I1004 05:06:36.745819 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerStarted","Data":"49d6712f9d9956a7435656f3240a96899eed69061a0b0547b2ed8f36ea03b770"} Oct 04 05:06:36 crc kubenswrapper[4574]: I1004 05:06:36.861825 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 04 05:06:38 crc kubenswrapper[4574]: I1004 05:06:38.771452 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerStarted","Data":"8b47bb35aa9cb33ff8d0d33163c8e6a7cbfdec6285ea9c8fbf5281ea6a916d48"} Oct 04 05:06:38 crc kubenswrapper[4574]: I1004 05:06:38.772089 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:06:38 crc kubenswrapper[4574]: I1004 05:06:38.807527 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.346470648 podStartE2EDuration="7.807504398s" podCreationTimestamp="2025-10-04 05:06:31 +0000 UTC" firstStartedPulling="2025-10-04 05:06:33.078815679 +0000 UTC m=+1218.932958731" lastFinishedPulling="2025-10-04 05:06:37.539849449 +0000 UTC m=+1223.393992481" observedRunningTime="2025-10-04 05:06:38.800728005 +0000 UTC m=+1224.654871057" watchObservedRunningTime="2025-10-04 05:06:38.807504398 +0000 UTC m=+1224.661647440" Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.713788 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" 
podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.714131 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.714890 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"6d70675baea48ecf302727fcdbee1acb4f0596c28e0e43a0d20ca60fc171a6ba"} pod="openstack/horizon-57c7ff446b-7tmwn" containerMessage="Container horizon failed startup probe, will be restarted" Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.714940 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" containerID="cri-o://6d70675baea48ecf302727fcdbee1acb4f0596c28e0e43a0d20ca60fc171a6ba" gracePeriod=30 Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.841030 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.841126 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.842074 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"7e11226527b91eec1e02086808d22606900e396ca4000781b4ed8449905f45e8"} 
pod="openstack/horizon-57bfb4d496-nv6hv" containerMessage="Container horizon failed startup probe, will be restarted" Oct 04 05:06:39 crc kubenswrapper[4574]: I1004 05:06:39.842146 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" containerID="cri-o://7e11226527b91eec1e02086808d22606900e396ca4000781b4ed8449905f45e8" gracePeriod=30 Oct 04 05:06:41 crc kubenswrapper[4574]: I1004 05:06:41.817055 4574 generic.go:334] "Generic (PLEG): container finished" podID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerID="5822fae071cbbe935e582094b21ed25c939bd3f5fa1945a0c757be037c4c79a5" exitCode=137 Oct 04 05:06:41 crc kubenswrapper[4574]: I1004 05:06:41.817151 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"532cf81d-d285-432f-a77e-4b1b04f4388c","Type":"ContainerDied","Data":"5822fae071cbbe935e582094b21ed25c939bd3f5fa1945a0c757be037c4c79a5"} Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.173994 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.175051 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="proxy-httpd" containerID="cri-o://8b47bb35aa9cb33ff8d0d33163c8e6a7cbfdec6285ea9c8fbf5281ea6a916d48" gracePeriod=30 Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.175285 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="sg-core" containerID="cri-o://49d6712f9d9956a7435656f3240a96899eed69061a0b0547b2ed8f36ea03b770" gracePeriod=30 Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.175417 4574 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-notification-agent" containerID="cri-o://926af762872a85d7a56acefa6ec843c706552f126604b0d1a7fbdc6d4fea983a" gracePeriod=30 Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.175002 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-central-agent" containerID="cri-o://a3d3db666eba41cd9804ed72ca012b2104dcf83e9a2305ea39d01f7f95ab1714" gracePeriod=30 Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.406227 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.861469 4574 generic.go:334] "Generic (PLEG): container finished" podID="651b5d55-242f-4512-8ffc-d59a70f71660" containerID="8b47bb35aa9cb33ff8d0d33163c8e6a7cbfdec6285ea9c8fbf5281ea6a916d48" exitCode=0 Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.861506 4574 generic.go:334] "Generic (PLEG): container finished" podID="651b5d55-242f-4512-8ffc-d59a70f71660" containerID="49d6712f9d9956a7435656f3240a96899eed69061a0b0547b2ed8f36ea03b770" exitCode=2 Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.861530 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerDied","Data":"8b47bb35aa9cb33ff8d0d33163c8e6a7cbfdec6285ea9c8fbf5281ea6a916d48"} Oct 04 05:06:42 crc kubenswrapper[4574]: I1004 05:06:42.861560 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerDied","Data":"49d6712f9d9956a7435656f3240a96899eed69061a0b0547b2ed8f36ea03b770"} Oct 04 05:06:43 crc kubenswrapper[4574]: I1004 05:06:43.871852 4574 generic.go:334] "Generic (PLEG): container finished" 
podID="651b5d55-242f-4512-8ffc-d59a70f71660" containerID="926af762872a85d7a56acefa6ec843c706552f126604b0d1a7fbdc6d4fea983a" exitCode=0 Oct 04 05:06:43 crc kubenswrapper[4574]: I1004 05:06:43.871912 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerDied","Data":"926af762872a85d7a56acefa6ec843c706552f126604b0d1a7fbdc6d4fea983a"} Oct 04 05:06:46 crc kubenswrapper[4574]: I1004 05:06:46.190784 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.173:8776/healthcheck\": dial tcp 10.217.0.173:8776: connect: connection refused" Oct 04 05:06:47 crc kubenswrapper[4574]: I1004 05:06:47.917387 4574 generic.go:334] "Generic (PLEG): container finished" podID="651b5d55-242f-4512-8ffc-d59a70f71660" containerID="a3d3db666eba41cd9804ed72ca012b2104dcf83e9a2305ea39d01f7f95ab1714" exitCode=0 Oct 04 05:06:47 crc kubenswrapper[4574]: I1004 05:06:47.917464 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerDied","Data":"a3d3db666eba41cd9804ed72ca012b2104dcf83e9a2305ea39d01f7f95ab1714"} Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.515799 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.558987 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrj4k\" (UniqueName: \"kubernetes.io/projected/532cf81d-d285-432f-a77e-4b1b04f4388c-kube-api-access-lrj4k\") pod \"532cf81d-d285-432f-a77e-4b1b04f4388c\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559064 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data\") pod \"532cf81d-d285-432f-a77e-4b1b04f4388c\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559083 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/532cf81d-d285-432f-a77e-4b1b04f4388c-etc-machine-id\") pod \"532cf81d-d285-432f-a77e-4b1b04f4388c\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559128 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532cf81d-d285-432f-a77e-4b1b04f4388c-logs\") pod \"532cf81d-d285-432f-a77e-4b1b04f4388c\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559222 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-combined-ca-bundle\") pod \"532cf81d-d285-432f-a77e-4b1b04f4388c\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559295 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-scripts\") pod \"532cf81d-d285-432f-a77e-4b1b04f4388c\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559362 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data-custom\") pod \"532cf81d-d285-432f-a77e-4b1b04f4388c\" (UID: \"532cf81d-d285-432f-a77e-4b1b04f4388c\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559508 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532cf81d-d285-432f-a77e-4b1b04f4388c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "532cf81d-d285-432f-a77e-4b1b04f4388c" (UID: "532cf81d-d285-432f-a77e-4b1b04f4388c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.559998 4574 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/532cf81d-d285-432f-a77e-4b1b04f4388c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.560416 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532cf81d-d285-432f-a77e-4b1b04f4388c-logs" (OuterVolumeSpecName: "logs") pod "532cf81d-d285-432f-a77e-4b1b04f4388c" (UID: "532cf81d-d285-432f-a77e-4b1b04f4388c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.574986 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532cf81d-d285-432f-a77e-4b1b04f4388c-kube-api-access-lrj4k" (OuterVolumeSpecName: "kube-api-access-lrj4k") pod "532cf81d-d285-432f-a77e-4b1b04f4388c" (UID: "532cf81d-d285-432f-a77e-4b1b04f4388c"). InnerVolumeSpecName "kube-api-access-lrj4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.608635 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "532cf81d-d285-432f-a77e-4b1b04f4388c" (UID: "532cf81d-d285-432f-a77e-4b1b04f4388c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.627771 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-scripts" (OuterVolumeSpecName: "scripts") pod "532cf81d-d285-432f-a77e-4b1b04f4388c" (UID: "532cf81d-d285-432f-a77e-4b1b04f4388c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.668352 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532cf81d-d285-432f-a77e-4b1b04f4388c-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.668387 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.668398 4574 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.668432 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrj4k\" (UniqueName: \"kubernetes.io/projected/532cf81d-d285-432f-a77e-4b1b04f4388c-kube-api-access-lrj4k\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.714137 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "532cf81d-d285-432f-a77e-4b1b04f4388c" (UID: "532cf81d-d285-432f-a77e-4b1b04f4388c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.772444 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.782450 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data" (OuterVolumeSpecName: "config-data") pod "532cf81d-d285-432f-a77e-4b1b04f4388c" (UID: "532cf81d-d285-432f-a77e-4b1b04f4388c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.832522 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.875787 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532cf81d-d285-432f-a77e-4b1b04f4388c-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.960393 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rzg59" event={"ID":"232b9769-2677-4ce8-991e-a8b94b2e5de1","Type":"ContainerStarted","Data":"6d46930556fab887ac8159f85f1d1236bb3505d414e9dbb9f4a82b03c276d8f5"} Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.972226 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"532cf81d-d285-432f-a77e-4b1b04f4388c","Type":"ContainerDied","Data":"85d04f5fb8160911dfd6995bcdd7557645a153709c32c83e6994499fdada5b93"} Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.972296 4574 scope.go:117] "RemoveContainer" containerID="5822fae071cbbe935e582094b21ed25c939bd3f5fa1945a0c757be037c4c79a5" Oct 04 05:06:50 
crc kubenswrapper[4574]: I1004 05:06:50.972430 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.994870 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd2q5\" (UniqueName: \"kubernetes.io/projected/651b5d55-242f-4512-8ffc-d59a70f71660-kube-api-access-rd2q5\") pod \"651b5d55-242f-4512-8ffc-d59a70f71660\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.994975 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-log-httpd\") pod \"651b5d55-242f-4512-8ffc-d59a70f71660\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.995009 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-run-httpd\") pod \"651b5d55-242f-4512-8ffc-d59a70f71660\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.995035 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-combined-ca-bundle\") pod \"651b5d55-242f-4512-8ffc-d59a70f71660\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.995058 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-sg-core-conf-yaml\") pod \"651b5d55-242f-4512-8ffc-d59a70f71660\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.995110 4574 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-config-data\") pod \"651b5d55-242f-4512-8ffc-d59a70f71660\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.995178 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-scripts\") pod \"651b5d55-242f-4512-8ffc-d59a70f71660\" (UID: \"651b5d55-242f-4512-8ffc-d59a70f71660\") " Oct 04 05:06:50 crc kubenswrapper[4574]: I1004 05:06:50.998811 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "651b5d55-242f-4512-8ffc-d59a70f71660" (UID: "651b5d55-242f-4512-8ffc-d59a70f71660"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.000630 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "651b5d55-242f-4512-8ffc-d59a70f71660" (UID: "651b5d55-242f-4512-8ffc-d59a70f71660"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.013791 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-scripts" (OuterVolumeSpecName: "scripts") pod "651b5d55-242f-4512-8ffc-d59a70f71660" (UID: "651b5d55-242f-4512-8ffc-d59a70f71660"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.013934 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651b5d55-242f-4512-8ffc-d59a70f71660-kube-api-access-rd2q5" (OuterVolumeSpecName: "kube-api-access-rd2q5") pod "651b5d55-242f-4512-8ffc-d59a70f71660" (UID: "651b5d55-242f-4512-8ffc-d59a70f71660"). InnerVolumeSpecName "kube-api-access-rd2q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.024259 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rzg59" podStartSLOduration=2.786330091 podStartE2EDuration="20.024219291s" podCreationTimestamp="2025-10-04 05:06:31 +0000 UTC" firstStartedPulling="2025-10-04 05:06:32.994802497 +0000 UTC m=+1218.848945539" lastFinishedPulling="2025-10-04 05:06:50.232691697 +0000 UTC m=+1236.086834739" observedRunningTime="2025-10-04 05:06:50.994288189 +0000 UTC m=+1236.848431231" watchObservedRunningTime="2025-10-04 05:06:51.024219291 +0000 UTC m=+1236.878362333" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.033030 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"651b5d55-242f-4512-8ffc-d59a70f71660","Type":"ContainerDied","Data":"f92f98b2d6776416a31493ab634e693ecd4a6c8babfc5e0da7855d38baf2f2b2"} Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.033108 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.088323 4574 scope.go:117] "RemoveContainer" containerID="de48f94498070689a4fb364de069bc21c63e13851065e9aafa43e5ffbfe49078" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.090713 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.133082 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.133112 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/651b5d55-242f-4512-8ffc-d59a70f71660-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.133129 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.133142 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd2q5\" (UniqueName: \"kubernetes.io/projected/651b5d55-242f-4512-8ffc-d59a70f71660-kube-api-access-rd2q5\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.190398 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.243387 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: E1004 05:06:51.243920 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-central-agent" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.243954 
4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-central-agent" Oct 04 05:06:51 crc kubenswrapper[4574]: E1004 05:06:51.243971 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="sg-core" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.243977 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="sg-core" Oct 04 05:06:51 crc kubenswrapper[4574]: E1004 05:06:51.243989 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.243995 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api" Oct 04 05:06:51 crc kubenswrapper[4574]: E1004 05:06:51.244003 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-notification-agent" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244009 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-notification-agent" Oct 04 05:06:51 crc kubenswrapper[4574]: E1004 05:06:51.244040 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="proxy-httpd" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244046 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="proxy-httpd" Oct 04 05:06:51 crc kubenswrapper[4574]: E1004 05:06:51.244074 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api-log" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244080 4574 
state_mem.go:107] "Deleted CPUSet assignment" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api-log" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244290 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244305 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" containerName="cinder-api-log" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244327 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-notification-agent" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244344 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="ceilometer-central-agent" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244358 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="sg-core" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.244366 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" containerName="proxy-httpd" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.254343 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.259468 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.259738 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.259919 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.273145 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.274947 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "651b5d55-242f-4512-8ffc-d59a70f71660" (UID: "651b5d55-242f-4512-8ffc-d59a70f71660"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.295385 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "651b5d55-242f-4512-8ffc-d59a70f71660" (UID: "651b5d55-242f-4512-8ffc-d59a70f71660"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.325044 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-config-data" (OuterVolumeSpecName: "config-data") pod "651b5d55-242f-4512-8ffc-d59a70f71660" (UID: "651b5d55-242f-4512-8ffc-d59a70f71660"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.340253 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38984f83-1657-45b8-bcd4-448c2306ea86-logs\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.340423 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wzsc\" (UniqueName: \"kubernetes.io/projected/38984f83-1657-45b8-bcd4-448c2306ea86-kube-api-access-2wzsc\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.340557 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-config-data\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.340659 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-config-data-custom\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.340803 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-scripts\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.340918 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.341017 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38984f83-1657-45b8-bcd4-448c2306ea86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.341128 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.341227 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-public-tls-certs\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.341514 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.341597 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:51 crc 
kubenswrapper[4574]: I1004 05:06:51.341676 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651b5d55-242f-4512-8ffc-d59a70f71660-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.403433 4574 scope.go:117] "RemoveContainer" containerID="8b47bb35aa9cb33ff8d0d33163c8e6a7cbfdec6285ea9c8fbf5281ea6a916d48" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.431769 4574 scope.go:117] "RemoveContainer" containerID="49d6712f9d9956a7435656f3240a96899eed69061a0b0547b2ed8f36ea03b770" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.443779 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.443835 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38984f83-1657-45b8-bcd4-448c2306ea86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.443883 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.443908 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.443974 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38984f83-1657-45b8-bcd4-448c2306ea86-logs\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.444007 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wzsc\" (UniqueName: \"kubernetes.io/projected/38984f83-1657-45b8-bcd4-448c2306ea86-kube-api-access-2wzsc\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.444042 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-config-data\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.444067 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-config-data-custom\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.444140 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-scripts\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.447973 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-scripts\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.448322 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38984f83-1657-45b8-bcd4-448c2306ea86-logs\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.450814 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38984f83-1657-45b8-bcd4-448c2306ea86-etc-machine-id\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.452816 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-public-tls-certs\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.454286 4574 scope.go:117] "RemoveContainer" containerID="926af762872a85d7a56acefa6ec843c706552f126604b0d1a7fbdc6d4fea983a" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.458487 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.459702 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.463215 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-config-data-custom\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.463968 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.472945 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38984f83-1657-45b8-bcd4-448c2306ea86-config-data\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.486115 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.487851 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wzsc\" (UniqueName: \"kubernetes.io/projected/38984f83-1657-45b8-bcd4-448c2306ea86-kube-api-access-2wzsc\") pod \"cinder-api-0\" (UID: \"38984f83-1657-45b8-bcd4-448c2306ea86\") " pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.496157 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.499699 4574 scope.go:117] "RemoveContainer" containerID="a3d3db666eba41cd9804ed72ca012b2104dcf83e9a2305ea39d01f7f95ab1714" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.509965 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.520127 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.520372 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.534789 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.647799 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.647842 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.648003 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr5sw\" (UniqueName: \"kubernetes.io/projected/f1b68347-847d-406d-a6c9-b99884baf6c0-kube-api-access-zr5sw\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.648021 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-log-httpd\") pod \"ceilometer-0\" (UID: 
\"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.648049 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-scripts\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.648165 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.648226 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-config-data\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.734845 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.750112 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-config-data\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.750191 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.750217 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.750335 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5sw\" (UniqueName: \"kubernetes.io/projected/f1b68347-847d-406d-a6c9-b99884baf6c0-kube-api-access-zr5sw\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.750359 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.750397 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-scripts\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.750430 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.751605 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.751823 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.756946 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.757617 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.760885 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-config-data\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.762528 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-scripts\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.774358 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5sw\" (UniqueName: \"kubernetes.io/projected/f1b68347-847d-406d-a6c9-b99884baf6c0-kube-api-access-zr5sw\") pod \"ceilometer-0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " pod="openstack/ceilometer-0" Oct 04 05:06:51 crc kubenswrapper[4574]: I1004 05:06:51.841548 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:06:52 crc kubenswrapper[4574]: I1004 05:06:52.374964 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 04 05:06:52 crc kubenswrapper[4574]: I1004 05:06:52.517073 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:06:52 crc kubenswrapper[4574]: I1004 05:06:52.746147 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532cf81d-d285-432f-a77e-4b1b04f4388c" path="/var/lib/kubelet/pods/532cf81d-d285-432f-a77e-4b1b04f4388c/volumes" Oct 04 05:06:52 crc kubenswrapper[4574]: I1004 05:06:52.748615 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651b5d55-242f-4512-8ffc-d59a70f71660" path="/var/lib/kubelet/pods/651b5d55-242f-4512-8ffc-d59a70f71660/volumes" Oct 04 05:06:53 crc kubenswrapper[4574]: I1004 05:06:53.078469 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerStarted","Data":"35837e7b8737989ce90daf745529730a160bc7cf03a1b747c66976e81f819a6b"} Oct 04 05:06:53 crc kubenswrapper[4574]: I1004 05:06:53.079953 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38984f83-1657-45b8-bcd4-448c2306ea86","Type":"ContainerStarted","Data":"7e08f17aa68569bdedfdc726140a78bb66789c0991c9233ec2906fe01bc24450"} Oct 04 05:06:53 crc kubenswrapper[4574]: I1004 05:06:53.716671 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:06:53 crc kubenswrapper[4574]: I1004 05:06:53.717278 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-log" containerID="cri-o://c65a0665b928d1cf8419e151c5471d2eda1ec225554cc3eda9929279ea550887" gracePeriod=30 Oct 04 05:06:53 crc 
kubenswrapper[4574]: I1004 05:06:53.717417 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-httpd" containerID="cri-o://b8aec1b4b90597eb94fb6acd08dfb480d7db36d403006e7977e040ab6158f37a" gracePeriod=30 Oct 04 05:06:54 crc kubenswrapper[4574]: I1004 05:06:54.101213 4574 generic.go:334] "Generic (PLEG): container finished" podID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerID="c65a0665b928d1cf8419e151c5471d2eda1ec225554cc3eda9929279ea550887" exitCode=143 Oct 04 05:06:54 crc kubenswrapper[4574]: I1004 05:06:54.101565 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8ebe95c-3128-46bd-8529-b87b860a6098","Type":"ContainerDied","Data":"c65a0665b928d1cf8419e151c5471d2eda1ec225554cc3eda9929279ea550887"} Oct 04 05:06:54 crc kubenswrapper[4574]: I1004 05:06:54.102619 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"38984f83-1657-45b8-bcd4-448c2306ea86","Type":"ContainerStarted","Data":"05158f7e1ca27e9818f1c3000e269fba1be3223988f30b463a8142c80ff3dcba"} Oct 04 05:06:54 crc kubenswrapper[4574]: I1004 05:06:54.103958 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerStarted","Data":"2a3bae4e2a51ef376c752e5f90e6a3bbea5aaf3cd5c5f3a86c4c41af0401f83f"} Oct 04 05:06:55 crc kubenswrapper[4574]: I1004 05:06:55.117501 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerStarted","Data":"069df0229c63165d1cf180b877fb44c309fea2b96410dffa33f1b9c2f38562f9"} Oct 04 05:06:55 crc kubenswrapper[4574]: I1004 05:06:55.120049 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"38984f83-1657-45b8-bcd4-448c2306ea86","Type":"ContainerStarted","Data":"ec887f9f1aaaf37c2113ed295c0d2a01d571d34dc76b6206cbcaaf919f4c653a"} Oct 04 05:06:55 crc kubenswrapper[4574]: I1004 05:06:55.120308 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 04 05:06:55 crc kubenswrapper[4574]: I1004 05:06:55.155073 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.1550533 podStartE2EDuration="4.1550533s" podCreationTimestamp="2025-10-04 05:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:06:55.151935901 +0000 UTC m=+1241.006078943" watchObservedRunningTime="2025-10-04 05:06:55.1550533 +0000 UTC m=+1241.009196342" Oct 04 05:06:56 crc kubenswrapper[4574]: I1004 05:06:56.137027 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerStarted","Data":"e6eec2c2e234b2d430a42c5454c9684cdf163c8126ee2222519cd4b6db36263b"} Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.149601 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerStarted","Data":"3253de83c681d46829352163d14a80428d92664663b51e31072b7b0bea552bd6"} Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.150059 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.152598 4574 generic.go:334] "Generic (PLEG): container finished" podID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerID="b8aec1b4b90597eb94fb6acd08dfb480d7db36d403006e7977e040ab6158f37a" exitCode=0 Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.152625 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"a8ebe95c-3128-46bd-8529-b87b860a6098","Type":"ContainerDied","Data":"b8aec1b4b90597eb94fb6acd08dfb480d7db36d403006e7977e040ab6158f37a"} Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.179427 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.559333441 podStartE2EDuration="6.17940858s" podCreationTimestamp="2025-10-04 05:06:51 +0000 UTC" firstStartedPulling="2025-10-04 05:06:52.538956653 +0000 UTC m=+1238.393099695" lastFinishedPulling="2025-10-04 05:06:56.159031792 +0000 UTC m=+1242.013174834" observedRunningTime="2025-10-04 05:06:57.177130106 +0000 UTC m=+1243.031273148" watchObservedRunningTime="2025-10-04 05:06:57.17940858 +0000 UTC m=+1243.033551622" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.448245 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.575807 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-combined-ca-bundle\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.575857 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.575896 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4ckb\" (UniqueName: \"kubernetes.io/projected/a8ebe95c-3128-46bd-8529-b87b860a6098-kube-api-access-n4ckb\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: 
\"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.575917 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-logs\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.576106 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-scripts\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.576129 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-httpd-run\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.576168 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-public-tls-certs\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.576203 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-config-data\") pod \"a8ebe95c-3128-46bd-8529-b87b860a6098\" (UID: \"a8ebe95c-3128-46bd-8529-b87b860a6098\") " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.577823 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-logs" (OuterVolumeSpecName: "logs") 
pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.578058 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.585460 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-scripts" (OuterVolumeSpecName: "scripts") pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.588750 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ebe95c-3128-46bd-8529-b87b860a6098-kube-api-access-n4ckb" (OuterVolumeSpecName: "kube-api-access-n4ckb") pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "kube-api-access-n4ckb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.590863 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.675971 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.678492 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.678539 4574 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.678555 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4ckb\" (UniqueName: \"kubernetes.io/projected/a8ebe95c-3128-46bd-8529-b87b860a6098-kube-api-access-n4ckb\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.678571 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.678583 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.678594 4574 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a8ebe95c-3128-46bd-8529-b87b860a6098-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.694771 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.730063 4574 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.733811 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-config-data" (OuterVolumeSpecName: "config-data") pod "a8ebe95c-3128-46bd-8529-b87b860a6098" (UID: "a8ebe95c-3128-46bd-8529-b87b860a6098"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.779779 4574 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.779811 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8ebe95c-3128-46bd-8529-b87b860a6098-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:57 crc kubenswrapper[4574]: I1004 05:06:57.779822 4574 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.164282 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8ebe95c-3128-46bd-8529-b87b860a6098","Type":"ContainerDied","Data":"d20831b8584e14c5edeb86245bae8046b56d0720bbe4e9ecd21ba5932ce485d7"} Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.164338 4574 scope.go:117] "RemoveContainer" containerID="b8aec1b4b90597eb94fb6acd08dfb480d7db36d403006e7977e040ab6158f37a" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.164351 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.205902 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.219176 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.233549 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.233689 4574 scope.go:117] "RemoveContainer" containerID="c65a0665b928d1cf8419e151c5471d2eda1ec225554cc3eda9929279ea550887" Oct 04 05:06:58 crc kubenswrapper[4574]: E1004 05:06:58.233936 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-httpd" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.233948 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-httpd" Oct 04 05:06:58 crc kubenswrapper[4574]: E1004 05:06:58.233977 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-log" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.233985 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-log" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.234227 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-log" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.234262 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" containerName="glance-httpd" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.235208 4574 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.237699 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.238447 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.253781 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293337 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293423 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293490 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f5514e6-eceb-4683-9633-684cc13d5458-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293526 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-config-data\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293546 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f5514e6-eceb-4683-9633-684cc13d5458-logs\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293565 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm66l\" (UniqueName: \"kubernetes.io/projected/5f5514e6-eceb-4683-9633-684cc13d5458-kube-api-access-zm66l\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293607 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-scripts\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.293657 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.395646 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396036 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396090 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f5514e6-eceb-4683-9633-684cc13d5458-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396124 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-config-data\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396141 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f5514e6-eceb-4683-9633-684cc13d5458-logs\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396165 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm66l\" (UniqueName: 
\"kubernetes.io/projected/5f5514e6-eceb-4683-9633-684cc13d5458-kube-api-access-zm66l\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396201 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-scripts\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396315 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396672 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f5514e6-eceb-4683-9633-684cc13d5458-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396710 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.396769 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f5514e6-eceb-4683-9633-684cc13d5458-logs\") pod 
\"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.401179 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.401671 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-scripts\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.402158 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-config-data\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.405854 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5514e6-eceb-4683-9633-684cc13d5458-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.419046 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm66l\" (UniqueName: \"kubernetes.io/projected/5f5514e6-eceb-4683-9633-684cc13d5458-kube-api-access-zm66l\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " 
pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.443809 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5f5514e6-eceb-4683-9633-684cc13d5458\") " pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.569005 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 04 05:06:58 crc kubenswrapper[4574]: I1004 05:06:58.749633 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ebe95c-3128-46bd-8529-b87b860a6098" path="/var/lib/kubelet/pods/a8ebe95c-3128-46bd-8529-b87b860a6098/volumes" Oct 04 05:06:59 crc kubenswrapper[4574]: I1004 05:06:59.271210 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 04 05:06:59 crc kubenswrapper[4574]: W1004 05:06:59.282101 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f5514e6_eceb_4683_9633_684cc13d5458.slice/crio-eb70795fa6d18b32292d102749e080bee19b768557725f06c150b01497d15599 WatchSource:0}: Error finding container eb70795fa6d18b32292d102749e080bee19b768557725f06c150b01497d15599: Status 404 returned error can't find the container with id eb70795fa6d18b32292d102749e080bee19b768557725f06c150b01497d15599 Oct 04 05:06:59 crc kubenswrapper[4574]: I1004 05:06:59.666203 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:06:59 crc kubenswrapper[4574]: I1004 05:06:59.666744 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-log" 
containerID="cri-o://aa20b5936822cce937ad9574e370a9a1cc746c8eed61f5e21dfe9f81965eeb99" gracePeriod=30 Oct 04 05:06:59 crc kubenswrapper[4574]: I1004 05:06:59.666851 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-httpd" containerID="cri-o://37ccbd2726c9a4b46c60ff716d803cf8f50d875d530c80d97e42832cc6d84a23" gracePeriod=30 Oct 04 05:07:00 crc kubenswrapper[4574]: I1004 05:07:00.194136 4574 generic.go:334] "Generic (PLEG): container finished" podID="af57198a-6432-454b-ab0f-6e07c76f166b" containerID="aa20b5936822cce937ad9574e370a9a1cc746c8eed61f5e21dfe9f81965eeb99" exitCode=143 Oct 04 05:07:00 crc kubenswrapper[4574]: I1004 05:07:00.194520 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af57198a-6432-454b-ab0f-6e07c76f166b","Type":"ContainerDied","Data":"aa20b5936822cce937ad9574e370a9a1cc746c8eed61f5e21dfe9f81965eeb99"} Oct 04 05:07:00 crc kubenswrapper[4574]: I1004 05:07:00.199777 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5f5514e6-eceb-4683-9633-684cc13d5458","Type":"ContainerStarted","Data":"69d89d871468943bc9001df38874e1e3fc22526ec828a6209cc419bd4b05b9b3"} Oct 04 05:07:00 crc kubenswrapper[4574]: I1004 05:07:00.199834 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5f5514e6-eceb-4683-9633-684cc13d5458","Type":"ContainerStarted","Data":"eb70795fa6d18b32292d102749e080bee19b768557725f06c150b01497d15599"} Oct 04 05:07:01 crc kubenswrapper[4574]: I1004 05:07:01.210038 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5f5514e6-eceb-4683-9633-684cc13d5458","Type":"ContainerStarted","Data":"a29afa9a12e777f7e3de4a708dbfdc43906612fc3874e4a5c19ef1a7e99e3406"} Oct 04 05:07:02 
crc kubenswrapper[4574]: I1004 05:07:02.615136 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.615118939 podStartE2EDuration="4.615118939s" podCreationTimestamp="2025-10-04 05:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:01.241514953 +0000 UTC m=+1247.095657995" watchObservedRunningTime="2025-10-04 05:07:02.615118939 +0000 UTC m=+1248.469261981" Oct 04 05:07:02 crc kubenswrapper[4574]: I1004 05:07:02.617424 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:02 crc kubenswrapper[4574]: I1004 05:07:02.617666 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-central-agent" containerID="cri-o://2a3bae4e2a51ef376c752e5f90e6a3bbea5aaf3cd5c5f3a86c4c41af0401f83f" gracePeriod=30 Oct 04 05:07:02 crc kubenswrapper[4574]: I1004 05:07:02.617735 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="sg-core" containerID="cri-o://e6eec2c2e234b2d430a42c5454c9684cdf163c8126ee2222519cd4b6db36263b" gracePeriod=30 Oct 04 05:07:02 crc kubenswrapper[4574]: I1004 05:07:02.617776 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-notification-agent" containerID="cri-o://069df0229c63165d1cf180b877fb44c309fea2b96410dffa33f1b9c2f38562f9" gracePeriod=30 Oct 04 05:07:02 crc kubenswrapper[4574]: I1004 05:07:02.617833 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="proxy-httpd" 
containerID="cri-o://3253de83c681d46829352163d14a80428d92664663b51e31072b7b0bea552bd6" gracePeriod=30 Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.236259 4574 generic.go:334] "Generic (PLEG): container finished" podID="af57198a-6432-454b-ab0f-6e07c76f166b" containerID="37ccbd2726c9a4b46c60ff716d803cf8f50d875d530c80d97e42832cc6d84a23" exitCode=0 Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.236614 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af57198a-6432-454b-ab0f-6e07c76f166b","Type":"ContainerDied","Data":"37ccbd2726c9a4b46c60ff716d803cf8f50d875d530c80d97e42832cc6d84a23"} Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.256908 4574 generic.go:334] "Generic (PLEG): container finished" podID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerID="3253de83c681d46829352163d14a80428d92664663b51e31072b7b0bea552bd6" exitCode=0 Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.256946 4574 generic.go:334] "Generic (PLEG): container finished" podID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerID="e6eec2c2e234b2d430a42c5454c9684cdf163c8126ee2222519cd4b6db36263b" exitCode=2 Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.256957 4574 generic.go:334] "Generic (PLEG): container finished" podID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerID="069df0229c63165d1cf180b877fb44c309fea2b96410dffa33f1b9c2f38562f9" exitCode=0 Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.256969 4574 generic.go:334] "Generic (PLEG): container finished" podID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerID="2a3bae4e2a51ef376c752e5f90e6a3bbea5aaf3cd5c5f3a86c4c41af0401f83f" exitCode=0 Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.256995 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerDied","Data":"3253de83c681d46829352163d14a80428d92664663b51e31072b7b0bea552bd6"} Oct 04 
05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.257033 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerDied","Data":"e6eec2c2e234b2d430a42c5454c9684cdf163c8126ee2222519cd4b6db36263b"} Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.257047 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerDied","Data":"069df0229c63165d1cf180b877fb44c309fea2b96410dffa33f1b9c2f38562f9"} Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.257060 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerDied","Data":"2a3bae4e2a51ef376c752e5f90e6a3bbea5aaf3cd5c5f3a86c4c41af0401f83f"} Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.321629 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492391 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-config-data\") pod \"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492497 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-logs\") pod \"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492516 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-scripts\") pod 
\"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492624 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492687 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-internal-tls-certs\") pod \"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492804 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-combined-ca-bundle\") pod \"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492845 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-httpd-run\") pod \"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.492882 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9h7s\" (UniqueName: \"kubernetes.io/projected/af57198a-6432-454b-ab0f-6e07c76f166b-kube-api-access-z9h7s\") pod \"af57198a-6432-454b-ab0f-6e07c76f166b\" (UID: \"af57198a-6432-454b-ab0f-6e07c76f166b\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.493584 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-logs" (OuterVolumeSpecName: "logs") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.495123 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.500362 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af57198a-6432-454b-ab0f-6e07c76f166b-kube-api-access-z9h7s" (OuterVolumeSpecName: "kube-api-access-z9h7s") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "kube-api-access-z9h7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.504423 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-scripts" (OuterVolumeSpecName: "scripts") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.507642 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.550795 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.576295 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-config-data" (OuterVolumeSpecName: "config-data") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.597583 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9h7s\" (UniqueName: \"kubernetes.io/projected/af57198a-6432-454b-ab0f-6e07c76f166b-kube-api-access-z9h7s\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.597625 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.597638 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.597651 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc 
kubenswrapper[4574]: I1004 05:07:03.597676 4574 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.597689 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.597700 4574 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af57198a-6432-454b-ab0f-6e07c76f166b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.613061 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af57198a-6432-454b-ab0f-6e07c76f166b" (UID: "af57198a-6432-454b-ab0f-6e07c76f166b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.635162 4574 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.665770 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.699795 4574 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.699828 4574 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af57198a-6432-454b-ab0f-6e07c76f166b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.801112 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-sg-core-conf-yaml\") pod \"f1b68347-847d-406d-a6c9-b99884baf6c0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.801190 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-run-httpd\") pod \"f1b68347-847d-406d-a6c9-b99884baf6c0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.801216 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-scripts\") pod \"f1b68347-847d-406d-a6c9-b99884baf6c0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.801360 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-log-httpd\") pod \"f1b68347-847d-406d-a6c9-b99884baf6c0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 
05:07:03.801382 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-combined-ca-bundle\") pod \"f1b68347-847d-406d-a6c9-b99884baf6c0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.801407 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr5sw\" (UniqueName: \"kubernetes.io/projected/f1b68347-847d-406d-a6c9-b99884baf6c0-kube-api-access-zr5sw\") pod \"f1b68347-847d-406d-a6c9-b99884baf6c0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.801533 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-config-data\") pod \"f1b68347-847d-406d-a6c9-b99884baf6c0\" (UID: \"f1b68347-847d-406d-a6c9-b99884baf6c0\") " Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.802179 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f1b68347-847d-406d-a6c9-b99884baf6c0" (UID: "f1b68347-847d-406d-a6c9-b99884baf6c0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.802312 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f1b68347-847d-406d-a6c9-b99884baf6c0" (UID: "f1b68347-847d-406d-a6c9-b99884baf6c0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.807596 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b68347-847d-406d-a6c9-b99884baf6c0-kube-api-access-zr5sw" (OuterVolumeSpecName: "kube-api-access-zr5sw") pod "f1b68347-847d-406d-a6c9-b99884baf6c0" (UID: "f1b68347-847d-406d-a6c9-b99884baf6c0"). InnerVolumeSpecName "kube-api-access-zr5sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.809725 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-scripts" (OuterVolumeSpecName: "scripts") pod "f1b68347-847d-406d-a6c9-b99884baf6c0" (UID: "f1b68347-847d-406d-a6c9-b99884baf6c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.841720 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f1b68347-847d-406d-a6c9-b99884baf6c0" (UID: "f1b68347-847d-406d-a6c9-b99884baf6c0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.903546 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.903716 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.903791 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.903849 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b68347-847d-406d-a6c9-b99884baf6c0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.903903 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr5sw\" (UniqueName: \"kubernetes.io/projected/f1b68347-847d-406d-a6c9-b99884baf6c0-kube-api-access-zr5sw\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.919334 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1b68347-847d-406d-a6c9-b99884baf6c0" (UID: "f1b68347-847d-406d-a6c9-b99884baf6c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:03 crc kubenswrapper[4574]: I1004 05:07:03.965323 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-config-data" (OuterVolumeSpecName: "config-data") pod "f1b68347-847d-406d-a6c9-b99884baf6c0" (UID: "f1b68347-847d-406d-a6c9-b99884baf6c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.006311 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.006632 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b68347-847d-406d-a6c9-b99884baf6c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.081199 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.278183 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af57198a-6432-454b-ab0f-6e07c76f166b","Type":"ContainerDied","Data":"c9fcb5e4507a34e5517e20b17393a213380c47fb5e131c23b97658005fd612e0"} Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.278311 4574 scope.go:117] "RemoveContainer" containerID="37ccbd2726c9a4b46c60ff716d803cf8f50d875d530c80d97e42832cc6d84a23" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.278471 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.291339 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b68347-847d-406d-a6c9-b99884baf6c0","Type":"ContainerDied","Data":"35837e7b8737989ce90daf745529730a160bc7cf03a1b747c66976e81f819a6b"} Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.291469 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.338614 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.360539 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.374752 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.391457 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.398866 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: E1004 05:07:04.399499 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-central-agent" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399523 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-central-agent" Oct 04 05:07:04 crc kubenswrapper[4574]: E1004 05:07:04.399538 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-httpd" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 
05:07:04.399546 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-httpd" Oct 04 05:07:04 crc kubenswrapper[4574]: E1004 05:07:04.399564 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="proxy-httpd" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399573 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="proxy-httpd" Oct 04 05:07:04 crc kubenswrapper[4574]: E1004 05:07:04.399591 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-notification-agent" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399601 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-notification-agent" Oct 04 05:07:04 crc kubenswrapper[4574]: E1004 05:07:04.399636 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-log" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399642 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-log" Oct 04 05:07:04 crc kubenswrapper[4574]: E1004 05:07:04.399659 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="sg-core" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399667 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="sg-core" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399898 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="sg-core" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399916 4574 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-central-agent" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399927 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="ceilometer-notification-agent" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399940 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-httpd" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399953 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" containerName="glance-log" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.399965 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" containerName="proxy-httpd" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.401142 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.403558 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.404553 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.414783 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.416446 4574 scope.go:117] "RemoveContainer" containerID="aa20b5936822cce937ad9574e370a9a1cc746c8eed61f5e21dfe9f81965eeb99" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.424290 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.426806 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.427332 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.427581 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.443351 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.488326 4574 scope.go:117] "RemoveContainer" containerID="3253de83c681d46829352163d14a80428d92664663b51e31072b7b0bea552bd6" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521435 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521515 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-log-httpd\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521548 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq4rr\" (UniqueName: \"kubernetes.io/projected/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-kube-api-access-hq4rr\") pod \"glance-default-internal-api-0\" (UID: 
\"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521595 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521632 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521655 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521681 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-scripts\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521702 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-run-httpd\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " 
pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521756 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jb2\" (UniqueName: \"kubernetes.io/projected/3a604386-97b7-4ced-889a-f414b194db50-kube-api-access-q2jb2\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521816 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521855 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-config-data\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521883 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521919 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 
05:07:04.521946 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.521990 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.522466 4574 scope.go:117] "RemoveContainer" containerID="e6eec2c2e234b2d430a42c5454c9684cdf163c8126ee2222519cd4b6db36263b" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.546468 4574 scope.go:117] "RemoveContainer" containerID="069df0229c63165d1cf180b877fb44c309fea2b96410dffa33f1b9c2f38562f9" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.565572 4574 scope.go:117] "RemoveContainer" containerID="2a3bae4e2a51ef376c752e5f90e6a3bbea5aaf3cd5c5f3a86c4c41af0401f83f" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623568 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623611 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623634 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-scripts\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623668 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-run-httpd\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623721 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jb2\" (UniqueName: \"kubernetes.io/projected/3a604386-97b7-4ced-889a-f414b194db50-kube-api-access-q2jb2\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623764 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623796 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-config-data\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623827 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623858 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623884 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623925 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623942 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623974 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-log-httpd\") pod 
\"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.623999 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq4rr\" (UniqueName: \"kubernetes.io/projected/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-kube-api-access-hq4rr\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.624035 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.624354 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.624974 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.625183 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-log-httpd\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc 
kubenswrapper[4574]: I1004 05:07:04.625264 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-run-httpd\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.625612 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.632871 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.636678 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.643001 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.644665 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-scripts\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.645725 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.648628 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-config-data\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.649578 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.650130 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.654845 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jb2\" (UniqueName: \"kubernetes.io/projected/3a604386-97b7-4ced-889a-f414b194db50-kube-api-access-q2jb2\") pod \"ceilometer-0\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " 
pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.656220 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq4rr\" (UniqueName: \"kubernetes.io/projected/18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7-kube-api-access-hq4rr\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.677379 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7\") " pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.734136 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.747172 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af57198a-6432-454b-ab0f-6e07c76f166b" path="/var/lib/kubelet/pods/af57198a-6432-454b-ab0f-6e07c76f166b/volumes" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.748489 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:04 crc kubenswrapper[4574]: I1004 05:07:04.748864 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b68347-847d-406d-a6c9-b99884baf6c0" path="/var/lib/kubelet/pods/f1b68347-847d-406d-a6c9-b99884baf6c0/volumes" Oct 04 05:07:05 crc kubenswrapper[4574]: I1004 05:07:05.274131 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:05 crc kubenswrapper[4574]: I1004 05:07:05.339224 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerStarted","Data":"905191aa914bc62598ed1a521a9bda50e16bb424347d49cf3f228d9063e8bac3"} Oct 04 05:07:05 crc kubenswrapper[4574]: I1004 05:07:05.473357 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 04 05:07:06 crc kubenswrapper[4574]: I1004 05:07:06.356626 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7","Type":"ContainerStarted","Data":"4b39ce8cda7c6257598903613869fac5dc0c3577bcfed625d64e9640df9b8373"} Oct 04 05:07:07 crc kubenswrapper[4574]: I1004 05:07:07.366327 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerStarted","Data":"33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606"} Oct 04 05:07:07 crc kubenswrapper[4574]: I1004 05:07:07.368036 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7","Type":"ContainerStarted","Data":"7069eacb7c1d2e16067c314bd7fc0c21fec887013e64446758b1b84b08ae2a05"} Oct 04 05:07:08 crc kubenswrapper[4574]: I1004 05:07:08.379092 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7","Type":"ContainerStarted","Data":"77dcb7342dc6e608884165c8e75984967f16d8c68a2e69b6a640dd835774cb59"} Oct 04 05:07:08 crc kubenswrapper[4574]: I1004 05:07:08.408616 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.408598361 podStartE2EDuration="4.408598361s" podCreationTimestamp="2025-10-04 05:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:08.397275359 +0000 UTC m=+1254.251418401" watchObservedRunningTime="2025-10-04 05:07:08.408598361 +0000 UTC m=+1254.262741403" Oct 04 05:07:08 crc kubenswrapper[4574]: I1004 05:07:08.570870 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 04 05:07:08 crc kubenswrapper[4574]: I1004 05:07:08.571363 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 04 05:07:08 crc kubenswrapper[4574]: I1004 05:07:08.599035 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 04 05:07:08 crc kubenswrapper[4574]: I1004 05:07:08.612434 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 04 05:07:09 crc kubenswrapper[4574]: I1004 05:07:09.390073 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerStarted","Data":"6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75"} Oct 04 05:07:09 crc kubenswrapper[4574]: I1004 05:07:09.390477 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 04 05:07:09 
crc kubenswrapper[4574]: I1004 05:07:09.390517 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.402677 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerStarted","Data":"d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad"} Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.406438 4574 generic.go:334] "Generic (PLEG): container finished" podID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerID="6d70675baea48ecf302727fcdbee1acb4f0596c28e0e43a0d20ca60fc171a6ba" exitCode=137 Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.406499 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerDied","Data":"6d70675baea48ecf302727fcdbee1acb4f0596c28e0e43a0d20ca60fc171a6ba"} Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.406545 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerStarted","Data":"403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469"} Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.406565 4574 scope.go:117] "RemoveContainer" containerID="3e4dc4fc365b9ba947066873c9c3d152cb971dadf939b36d9d774912264c3816" Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.410128 4574 generic.go:334] "Generic (PLEG): container finished" podID="85281a42-f9ab-4302-9fe9-4e742075530f" containerID="7e11226527b91eec1e02086808d22606900e396ca4000781b4ed8449905f45e8" exitCode=137 Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.411035 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bfb4d496-nv6hv" 
event={"ID":"85281a42-f9ab-4302-9fe9-4e742075530f","Type":"ContainerDied","Data":"7e11226527b91eec1e02086808d22606900e396ca4000781b4ed8449905f45e8"} Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.411085 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bfb4d496-nv6hv" event={"ID":"85281a42-f9ab-4302-9fe9-4e742075530f","Type":"ContainerStarted","Data":"9731d0f2d510d1455fb179e9f43a1ff8756fbdc5e7a95005a3ec5f12d2955612"} Oct 04 05:07:10 crc kubenswrapper[4574]: I1004 05:07:10.626668 4574 scope.go:117] "RemoveContainer" containerID="bafa808bdf2a35dee0e61ef90e7b8b4999e39f07d3cde96c5386527343a5b987" Oct 04 05:07:12 crc kubenswrapper[4574]: I1004 05:07:12.312049 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 04 05:07:12 crc kubenswrapper[4574]: I1004 05:07:12.312670 4574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 04 05:07:12 crc kubenswrapper[4574]: I1004 05:07:12.454737 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerStarted","Data":"cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970"} Oct 04 05:07:12 crc kubenswrapper[4574]: I1004 05:07:12.456118 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:07:12 crc kubenswrapper[4574]: I1004 05:07:12.481310 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.685785325 podStartE2EDuration="8.481295946s" podCreationTimestamp="2025-10-04 05:07:04 +0000 UTC" firstStartedPulling="2025-10-04 05:07:05.294076955 +0000 UTC m=+1251.148219997" lastFinishedPulling="2025-10-04 05:07:11.089587576 +0000 UTC m=+1256.943730618" observedRunningTime="2025-10-04 05:07:12.477103397 +0000 UTC m=+1258.331246439" watchObservedRunningTime="2025-10-04 
05:07:12.481295946 +0000 UTC m=+1258.335438988" Oct 04 05:07:13 crc kubenswrapper[4574]: I1004 05:07:13.051597 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 04 05:07:13 crc kubenswrapper[4574]: I1004 05:07:13.464684 4574 generic.go:334] "Generic (PLEG): container finished" podID="232b9769-2677-4ce8-991e-a8b94b2e5de1" containerID="6d46930556fab887ac8159f85f1d1236bb3505d414e9dbb9f4a82b03c276d8f5" exitCode=0 Oct 04 05:07:13 crc kubenswrapper[4574]: I1004 05:07:13.465858 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rzg59" event={"ID":"232b9769-2677-4ce8-991e-a8b94b2e5de1","Type":"ContainerDied","Data":"6d46930556fab887ac8159f85f1d1236bb3505d414e9dbb9f4a82b03c276d8f5"} Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.751561 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.752002 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.806713 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.825485 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.845669 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-config-data\") pod \"232b9769-2677-4ce8-991e-a8b94b2e5de1\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.845720 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-combined-ca-bundle\") pod \"232b9769-2677-4ce8-991e-a8b94b2e5de1\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.845800 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-scripts\") pod \"232b9769-2677-4ce8-991e-a8b94b2e5de1\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.845916 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkd27\" (UniqueName: \"kubernetes.io/projected/232b9769-2677-4ce8-991e-a8b94b2e5de1-kube-api-access-qkd27\") pod \"232b9769-2677-4ce8-991e-a8b94b2e5de1\" (UID: \"232b9769-2677-4ce8-991e-a8b94b2e5de1\") " Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.860642 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/232b9769-2677-4ce8-991e-a8b94b2e5de1-kube-api-access-qkd27" (OuterVolumeSpecName: "kube-api-access-qkd27") pod "232b9769-2677-4ce8-991e-a8b94b2e5de1" (UID: "232b9769-2677-4ce8-991e-a8b94b2e5de1"). InnerVolumeSpecName "kube-api-access-qkd27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.861381 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.867618 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-scripts" (OuterVolumeSpecName: "scripts") pod "232b9769-2677-4ce8-991e-a8b94b2e5de1" (UID: "232b9769-2677-4ce8-991e-a8b94b2e5de1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.948741 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkd27\" (UniqueName: \"kubernetes.io/projected/232b9769-2677-4ce8-991e-a8b94b2e5de1-kube-api-access-qkd27\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.948777 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.950806 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-config-data" (OuterVolumeSpecName: "config-data") pod "232b9769-2677-4ce8-991e-a8b94b2e5de1" (UID: "232b9769-2677-4ce8-991e-a8b94b2e5de1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:14 crc kubenswrapper[4574]: I1004 05:07:14.961476 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "232b9769-2677-4ce8-991e-a8b94b2e5de1" (UID: "232b9769-2677-4ce8-991e-a8b94b2e5de1"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.050007 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.050051 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232b9769-2677-4ce8-991e-a8b94b2e5de1-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.485535 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rzg59" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.485525 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rzg59" event={"ID":"232b9769-2677-4ce8-991e-a8b94b2e5de1","Type":"ContainerDied","Data":"90903947d96184f3f9b7a0ca3b01e030387d5fdc0946ab4f714bf5a009083e3b"} Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.486246 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90903947d96184f3f9b7a0ca3b01e030387d5fdc0946ab4f714bf5a009083e3b" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.486278 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.486295 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.617749 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 04 05:07:15 crc kubenswrapper[4574]: E1004 05:07:15.618254 4574 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="232b9769-2677-4ce8-991e-a8b94b2e5de1" containerName="nova-cell0-conductor-db-sync" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.618277 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="232b9769-2677-4ce8-991e-a8b94b2e5de1" containerName="nova-cell0-conductor-db-sync" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.618517 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="232b9769-2677-4ce8-991e-a8b94b2e5de1" containerName="nova-cell0-conductor-db-sync" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.619292 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.625183 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.634010 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.634172 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bl8s2" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.768206 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vhxp\" (UniqueName: \"kubernetes.io/projected/8377c768-d10d-49d6-b43f-b1aeedcdeae6-kube-api-access-4vhxp\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.768344 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8377c768-d10d-49d6-b43f-b1aeedcdeae6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 
05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.768390 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8377c768-d10d-49d6-b43f-b1aeedcdeae6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.870165 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8377c768-d10d-49d6-b43f-b1aeedcdeae6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.871086 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8377c768-d10d-49d6-b43f-b1aeedcdeae6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.871351 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vhxp\" (UniqueName: \"kubernetes.io/projected/8377c768-d10d-49d6-b43f-b1aeedcdeae6-kube-api-access-4vhxp\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.885097 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8377c768-d10d-49d6-b43f-b1aeedcdeae6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.885903 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8377c768-d10d-49d6-b43f-b1aeedcdeae6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.902868 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vhxp\" (UniqueName: \"kubernetes.io/projected/8377c768-d10d-49d6-b43f-b1aeedcdeae6-kube-api-access-4vhxp\") pod \"nova-cell0-conductor-0\" (UID: \"8377c768-d10d-49d6-b43f-b1aeedcdeae6\") " pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:15 crc kubenswrapper[4574]: I1004 05:07:15.949180 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:16 crc kubenswrapper[4574]: I1004 05:07:16.510438 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 04 05:07:16 crc kubenswrapper[4574]: I1004 05:07:16.749021 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:16 crc kubenswrapper[4574]: I1004 05:07:16.749680 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-central-agent" containerID="cri-o://33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606" gracePeriod=30 Oct 04 05:07:16 crc kubenswrapper[4574]: I1004 05:07:16.749904 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="proxy-httpd" containerID="cri-o://cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970" gracePeriod=30 Oct 04 05:07:16 crc kubenswrapper[4574]: I1004 05:07:16.750103 4574 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="sg-core" containerID="cri-o://d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad" gracePeriod=30 Oct 04 05:07:16 crc kubenswrapper[4574]: I1004 05:07:16.750161 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-notification-agent" containerID="cri-o://6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75" gracePeriod=30 Oct 04 05:07:17 crc kubenswrapper[4574]: E1004 05:07:17.513977 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a604386_97b7_4ced_889a_f414b194db50.slice/crio-6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.516717 4574 generic.go:334] "Generic (PLEG): container finished" podID="3a604386-97b7-4ced-889a-f414b194db50" containerID="cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970" exitCode=0 Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.516823 4574 generic.go:334] "Generic (PLEG): container finished" podID="3a604386-97b7-4ced-889a-f414b194db50" containerID="d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad" exitCode=2 Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.516835 4574 generic.go:334] "Generic (PLEG): container finished" podID="3a604386-97b7-4ced-889a-f414b194db50" containerID="6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75" exitCode=0 Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.516772 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerDied","Data":"cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970"} Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.516907 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerDied","Data":"d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad"} Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.516922 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerDied","Data":"6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75"} Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.519391 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8377c768-d10d-49d6-b43f-b1aeedcdeae6","Type":"ContainerStarted","Data":"9f913ca4181f3df2eeb89b77cfd234f66196b9a986b88d22f726b6c6a4981ad4"} Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.519452 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8377c768-d10d-49d6-b43f-b1aeedcdeae6","Type":"ContainerStarted","Data":"76d65ac574064e14e0736c2db8961db871c12818a90bc47bd758d6238c406e6c"} Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.519571 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.542749 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.542730309 podStartE2EDuration="2.542730309s" podCreationTimestamp="2025-10-04 05:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:17.538788717 +0000 UTC 
m=+1263.392931769" watchObservedRunningTime="2025-10-04 05:07:17.542730309 +0000 UTC m=+1263.396873351" Oct 04 05:07:17 crc kubenswrapper[4574]: I1004 05:07:17.980163 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.030529 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-scripts\") pod \"3a604386-97b7-4ced-889a-f414b194db50\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.030882 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-run-httpd\") pod \"3a604386-97b7-4ced-889a-f414b194db50\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.030999 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-sg-core-conf-yaml\") pod \"3a604386-97b7-4ced-889a-f414b194db50\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.031082 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2jb2\" (UniqueName: \"kubernetes.io/projected/3a604386-97b7-4ced-889a-f414b194db50-kube-api-access-q2jb2\") pod \"3a604386-97b7-4ced-889a-f414b194db50\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.031218 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-config-data\") pod \"3a604386-97b7-4ced-889a-f414b194db50\" (UID: 
\"3a604386-97b7-4ced-889a-f414b194db50\") " Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.031425 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-log-httpd\") pod \"3a604386-97b7-4ced-889a-f414b194db50\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.031491 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-combined-ca-bundle\") pod \"3a604386-97b7-4ced-889a-f414b194db50\" (UID: \"3a604386-97b7-4ced-889a-f414b194db50\") " Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.036279 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a604386-97b7-4ced-889a-f414b194db50" (UID: "3a604386-97b7-4ced-889a-f414b194db50"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.037225 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a604386-97b7-4ced-889a-f414b194db50" (UID: "3a604386-97b7-4ced-889a-f414b194db50"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.044692 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-scripts" (OuterVolumeSpecName: "scripts") pod "3a604386-97b7-4ced-889a-f414b194db50" (UID: "3a604386-97b7-4ced-889a-f414b194db50"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.045436 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a604386-97b7-4ced-889a-f414b194db50-kube-api-access-q2jb2" (OuterVolumeSpecName: "kube-api-access-q2jb2") pod "3a604386-97b7-4ced-889a-f414b194db50" (UID: "3a604386-97b7-4ced-889a-f414b194db50"). InnerVolumeSpecName "kube-api-access-q2jb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.129384 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a604386-97b7-4ced-889a-f414b194db50" (UID: "3a604386-97b7-4ced-889a-f414b194db50"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.136592 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.137220 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.137247 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a604386-97b7-4ced-889a-f414b194db50-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.137258 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" 
Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.137270 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2jb2\" (UniqueName: \"kubernetes.io/projected/3a604386-97b7-4ced-889a-f414b194db50-kube-api-access-q2jb2\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.157377 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a604386-97b7-4ced-889a-f414b194db50" (UID: "3a604386-97b7-4ced-889a-f414b194db50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.225454 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-config-data" (OuterVolumeSpecName: "config-data") pod "3a604386-97b7-4ced-889a-f414b194db50" (UID: "3a604386-97b7-4ced-889a-f414b194db50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.239090 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.239129 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a604386-97b7-4ced-889a-f414b194db50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.529610 4574 generic.go:334] "Generic (PLEG): container finished" podID="3a604386-97b7-4ced-889a-f414b194db50" containerID="33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606" exitCode=0 Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.529688 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.529740 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerDied","Data":"33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606"} Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.529782 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a604386-97b7-4ced-889a-f414b194db50","Type":"ContainerDied","Data":"905191aa914bc62598ed1a521a9bda50e16bb424347d49cf3f228d9063e8bac3"} Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.529803 4574 scope.go:117] "RemoveContainer" containerID="cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.597370 4574 scope.go:117] "RemoveContainer" containerID="d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad" Oct 04 05:07:18 crc 
kubenswrapper[4574]: I1004 05:07:18.601482 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.610261 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.631207 4574 scope.go:117] "RemoveContainer" containerID="6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.648988 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.649374 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-central-agent" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649390 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-central-agent" Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.649408 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-notification-agent" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649415 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-notification-agent" Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.649426 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="proxy-httpd" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649432 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="proxy-httpd" Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.649460 4574 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="sg-core" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649466 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="sg-core" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649637 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-central-agent" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649660 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="sg-core" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649675 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="ceilometer-notification-agent" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.649685 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a604386-97b7-4ced-889a-f414b194db50" containerName="proxy-httpd" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.656701 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.659560 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.660062 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.663301 4574 scope.go:117] "RemoveContainer" containerID="33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.667019 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.710364 4574 scope.go:117] "RemoveContainer" containerID="cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970" Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.710820 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970\": container with ID starting with cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970 not found: ID does not exist" containerID="cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.710845 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970"} err="failed to get container status \"cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970\": rpc error: code = NotFound desc = could not find container \"cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970\": container with ID starting with cb28b7172d8e321c50186f27988be3f3e3b4ec4d313ea0ed92e1c0f7fe07f970 not found: ID does not exist" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 
05:07:18.710865 4574 scope.go:117] "RemoveContainer" containerID="d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad" Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.711152 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad\": container with ID starting with d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad not found: ID does not exist" containerID="d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.711170 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad"} err="failed to get container status \"d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad\": rpc error: code = NotFound desc = could not find container \"d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad\": container with ID starting with d4b467672838ceb0c985183ccb4285834f32ad6f3c7d9e3bf70000cc288f2dad not found: ID does not exist" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.711182 4574 scope.go:117] "RemoveContainer" containerID="6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75" Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.711454 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75\": container with ID starting with 6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75 not found: ID does not exist" containerID="6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.711470 4574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75"} err="failed to get container status \"6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75\": rpc error: code = NotFound desc = could not find container \"6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75\": container with ID starting with 6fb2b1b8972e8e2abcdb877dcfefb23e75183b712b42f6331f0805722fc94f75 not found: ID does not exist" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.711482 4574 scope.go:117] "RemoveContainer" containerID="33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606" Oct 04 05:07:18 crc kubenswrapper[4574]: E1004 05:07:18.711647 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606\": container with ID starting with 33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606 not found: ID does not exist" containerID="33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.711673 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606"} err="failed to get container status \"33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606\": rpc error: code = NotFound desc = could not find container \"33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606\": container with ID starting with 33dc360375e39fc75cd3aa347734dc296656b1d3ff7b7705e7097bfb13b24606 not found: ID does not exist" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.743615 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a604386-97b7-4ced-889a-f414b194db50" path="/var/lib/kubelet/pods/3a604386-97b7-4ced-889a-f414b194db50/volumes" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 
05:07:18.748427 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-log-httpd\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.748487 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5lfs\" (UniqueName: \"kubernetes.io/projected/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-kube-api-access-g5lfs\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.748611 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-run-httpd\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.749407 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-config-data\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.749473 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-scripts\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.749682 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.752760 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.854214 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.854293 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-log-httpd\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.854319 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5lfs\" (UniqueName: \"kubernetes.io/projected/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-kube-api-access-g5lfs\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.854360 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-run-httpd\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 
05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.855117 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-config-data\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.855145 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-scripts\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.855303 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.855859 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-run-httpd\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.855904 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-log-httpd\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.859716 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-scripts\") pod \"ceilometer-0\" (UID: 
\"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.859929 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.862824 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.863113 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-config-data\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.875647 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5lfs\" (UniqueName: \"kubernetes.io/projected/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-kube-api-access-g5lfs\") pod \"ceilometer-0\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " pod="openstack/ceilometer-0" Oct 04 05:07:18 crc kubenswrapper[4574]: I1004 05:07:18.980565 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:07:19 crc kubenswrapper[4574]: I1004 05:07:19.535800 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:19 crc kubenswrapper[4574]: I1004 05:07:19.710300 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:07:19 crc kubenswrapper[4574]: I1004 05:07:19.710611 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:07:19 crc kubenswrapper[4574]: I1004 05:07:19.833140 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:19 crc kubenswrapper[4574]: I1004 05:07:19.833283 4574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 04 05:07:19 crc kubenswrapper[4574]: I1004 05:07:19.840925 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:07:19 crc kubenswrapper[4574]: I1004 05:07:19.842149 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:07:20 crc kubenswrapper[4574]: I1004 05:07:20.183940 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 04 05:07:20 crc kubenswrapper[4574]: I1004 05:07:20.556088 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerStarted","Data":"bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474"} Oct 04 05:07:20 crc kubenswrapper[4574]: I1004 05:07:20.556136 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerStarted","Data":"1c6257107fdb1ade05bf2e6d7dc6c20d2548fff19c1401002ecfc0862a7cc982"} Oct 04 05:07:21 crc kubenswrapper[4574]: I1004 05:07:21.568606 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerStarted","Data":"873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260"} Oct 04 05:07:22 crc kubenswrapper[4574]: I1004 05:07:22.579797 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerStarted","Data":"ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725"} Oct 04 05:07:23 crc kubenswrapper[4574]: I1004 05:07:23.589078 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerStarted","Data":"5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012"} Oct 04 05:07:23 crc kubenswrapper[4574]: I1004 05:07:23.590669 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:07:23 crc kubenswrapper[4574]: I1004 05:07:23.611568 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.952407509 podStartE2EDuration="5.61154953s" podCreationTimestamp="2025-10-04 05:07:18 +0000 UTC" firstStartedPulling="2025-10-04 05:07:19.551179196 +0000 UTC m=+1265.405322238" lastFinishedPulling="2025-10-04 05:07:23.210321217 +0000 UTC m=+1269.064464259" observedRunningTime="2025-10-04 05:07:23.609419649 +0000 UTC m=+1269.463562691" watchObservedRunningTime="2025-10-04 05:07:23.61154953 +0000 UTC m=+1269.465692572" Oct 04 05:07:25 crc kubenswrapper[4574]: I1004 05:07:25.985423 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 
04 05:07:26 crc kubenswrapper[4574]: I1004 05:07:26.962108 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dxjss"] Oct 04 05:07:26 crc kubenswrapper[4574]: I1004 05:07:26.964407 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:26 crc kubenswrapper[4574]: I1004 05:07:26.968616 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 04 05:07:26 crc kubenswrapper[4574]: I1004 05:07:26.969207 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.041622 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dxjss"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.152887 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-config-data\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.152972 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.153024 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-scripts\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: 
\"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.153089 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6wgn\" (UniqueName: \"kubernetes.io/projected/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-kube-api-access-n6wgn\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.191956 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.194270 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.199704 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.217462 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.255636 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.257436 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.271077 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-scripts\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.271223 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6wgn\" (UniqueName: \"kubernetes.io/projected/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-kube-api-access-n6wgn\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.271706 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-config-data\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.271776 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.272815 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.320996 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-config-data\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.329756 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-scripts\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.331852 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.347301 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.381177 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.381344 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.381461 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwj7\" (UniqueName: \"kubernetes.io/projected/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-kube-api-access-hwwj7\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.381505 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-logs\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.381553 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.381595 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxjd\" (UniqueName: \"kubernetes.io/projected/ae90061f-8906-44b4-8195-286492c8d770-kube-api-access-6hxjd\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.381640 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-config-data\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.429580 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6wgn\" (UniqueName: 
\"kubernetes.io/projected/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-kube-api-access-n6wgn\") pod \"nova-cell0-cell-mapping-dxjss\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.433024 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.442842 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.468759 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.510746 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwj7\" (UniqueName: \"kubernetes.io/projected/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-kube-api-access-hwwj7\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.522433 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-logs\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.522776 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.522946 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxjd\" (UniqueName: 
\"kubernetes.io/projected/ae90061f-8906-44b4-8195-286492c8d770-kube-api-access-6hxjd\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.523121 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-config-data\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.523368 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.523614 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.533641 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.539110 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-logs\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " 
pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.558422 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.565370 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-config-data\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.590366 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.600465 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.615042 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxjd\" (UniqueName: \"kubernetes.io/projected/ae90061f-8906-44b4-8195-286492c8d770-kube-api-access-6hxjd\") pod \"nova-cell1-novncproxy-0\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.646563 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-config-data\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " 
pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.707565 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phqz\" (UniqueName: \"kubernetes.io/projected/bf330144-5f7d-44a5-ae98-85860c9d5ce5-kube-api-access-7phqz\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.708543 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.650156 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwj7\" (UniqueName: \"kubernetes.io/projected/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-kube-api-access-hwwj7\") pod \"nova-api-0\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.654709 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.720765 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.722842 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.728621 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.763436 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.816583 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.817055 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf11c800-77f4-491f-92da-13efb6916962-logs\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.817187 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-config-data\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.817337 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k2t2\" (UniqueName: \"kubernetes.io/projected/cf11c800-77f4-491f-92da-13efb6916962-kube-api-access-8k2t2\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.817557 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7phqz\" (UniqueName: \"kubernetes.io/projected/bf330144-5f7d-44a5-ae98-85860c9d5ce5-kube-api-access-7phqz\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.821686 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.821862 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-config-data\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.824554 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.835906 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-config-data\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.841707 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.888931 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6ptbq"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.891133 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.898149 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phqz\" (UniqueName: \"kubernetes.io/projected/bf330144-5f7d-44a5-ae98-85860c9d5ce5-kube-api-access-7phqz\") pod \"nova-scheduler-0\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.899011 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.929058 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf11c800-77f4-491f-92da-13efb6916962-logs\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.929103 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k2t2\" (UniqueName: \"kubernetes.io/projected/cf11c800-77f4-491f-92da-13efb6916962-kube-api-access-8k2t2\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.929178 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-config-data\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.929256 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.931144 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf11c800-77f4-491f-92da-13efb6916962-logs\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.937446 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-845d6d6f59-6ptbq"] Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.938498 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.940148 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-config-data\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.952089 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k2t2\" (UniqueName: \"kubernetes.io/projected/cf11c800-77f4-491f-92da-13efb6916962-kube-api-access-8k2t2\") pod \"nova-metadata-0\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " pod="openstack/nova-metadata-0" Oct 04 05:07:27 crc kubenswrapper[4574]: I1004 05:07:27.988164 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.037017 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-config\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.037551 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.038433 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.038514 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.056282 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgb74\" (UniqueName: \"kubernetes.io/projected/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-kube-api-access-kgb74\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" 
(UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.056406 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.097840 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.161963 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.162033 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.162099 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgb74\" (UniqueName: \"kubernetes.io/projected/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-kube-api-access-kgb74\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.162124 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.162243 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-config\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.162266 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.163271 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.163942 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.164941 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.165744 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.176633 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-config\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.364373 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgb74\" (UniqueName: \"kubernetes.io/projected/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-kube-api-access-kgb74\") pod \"dnsmasq-dns-845d6d6f59-6ptbq\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.528620 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.669477 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dxjss"] Oct 04 05:07:28 crc kubenswrapper[4574]: I1004 05:07:28.958057 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.261434 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.483830 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.540252 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.716429 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.750903 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf330144-5f7d-44a5-ae98-85860c9d5ce5","Type":"ContainerStarted","Data":"f07182c5f77060230ddd49741922e3c55f7aa82973ed534d517ada58466d7527"} Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.763390 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dxjss" event={"ID":"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25","Type":"ContainerStarted","Data":"bad33cd36963b5cbff962034b149243e97d7fa047dda6a2a802693e852b83b21"} Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.768311 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"cf11c800-77f4-491f-92da-13efb6916962","Type":"ContainerStarted","Data":"e14e465968c9a95030fab2032d0f5eae94b6eb2bb89815045ce8a16ff92d968a"} Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.770680 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ae90061f-8906-44b4-8195-286492c8d770","Type":"ContainerStarted","Data":"bbddb2fafce520744812b62521a1515d4b5418309521217dc8284bdc346f9d16"} Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.772347 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b","Type":"ContainerStarted","Data":"e7910f258bded4499b06e20bc3ad5595c19d330b98e1ba1562a5da4a042c6012"} Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.848200 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 04 05:07:29 crc kubenswrapper[4574]: I1004 05:07:29.950876 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6ptbq"] Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.654167 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4h6wk"] Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.655722 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.664602 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.664789 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.699762 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4h6wk"] Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.781981 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-scripts\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.782804 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.783315 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7r86\" (UniqueName: \"kubernetes.io/projected/9f813f9d-cacd-47ec-9f90-889f59e98949-kube-api-access-c7r86\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.783533 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-config-data\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.801122 4574 generic.go:334] "Generic (PLEG): container finished" podID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerID="6eecca9648b3983501de382fce85b978a00b2d2501345cea2d00a2dfc63d1ccb" exitCode=0 Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.801468 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" event={"ID":"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567","Type":"ContainerDied","Data":"6eecca9648b3983501de382fce85b978a00b2d2501345cea2d00a2dfc63d1ccb"} Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.801629 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" event={"ID":"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567","Type":"ContainerStarted","Data":"e021d66526ada5f7c37282d90bbc5d6047f6b2b1f9d65fd1c921a3ac36869071"} Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.851650 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dxjss" event={"ID":"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25","Type":"ContainerStarted","Data":"fde8a19cb416d9c2c9e7e3823e81f0ffe0b062d645c96114e137d08076b30e1f"} Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.892533 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.893520 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7r86\" (UniqueName: \"kubernetes.io/projected/9f813f9d-cacd-47ec-9f90-889f59e98949-kube-api-access-c7r86\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.893828 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-config-data\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.894537 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-scripts\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.901002 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-scripts\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.917442 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-config-data\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.918924 4574 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.940711 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7r86\" (UniqueName: \"kubernetes.io/projected/9f813f9d-cacd-47ec-9f90-889f59e98949-kube-api-access-c7r86\") pod \"nova-cell1-conductor-db-sync-4h6wk\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.941932 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dxjss" podStartSLOduration=4.941916105 podStartE2EDuration="4.941916105s" podCreationTimestamp="2025-10-04 05:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:30.935326348 +0000 UTC m=+1276.789469400" watchObservedRunningTime="2025-10-04 05:07:30.941916105 +0000 UTC m=+1276.796059147" Oct 04 05:07:30 crc kubenswrapper[4574]: I1004 05:07:30.985584 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:31 crc kubenswrapper[4574]: I1004 05:07:31.827476 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4h6wk"] Oct 04 05:07:31 crc kubenswrapper[4574]: W1004 05:07:31.868488 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f813f9d_cacd_47ec_9f90_889f59e98949.slice/crio-aeaa4e396fcbd914073cba3f7adeeaa2bbd17fd11f405becaa4450c7abc69805 WatchSource:0}: Error finding container aeaa4e396fcbd914073cba3f7adeeaa2bbd17fd11f405becaa4450c7abc69805: Status 404 returned error can't find the container with id aeaa4e396fcbd914073cba3f7adeeaa2bbd17fd11f405becaa4450c7abc69805 Oct 04 05:07:31 crc kubenswrapper[4574]: I1004 05:07:31.912733 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" event={"ID":"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567","Type":"ContainerStarted","Data":"8725e41c53a57855ab39facb6b009c34399e339c812891ca1ab473621e3e958c"} Oct 04 05:07:31 crc kubenswrapper[4574]: I1004 05:07:31.912791 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:31 crc kubenswrapper[4574]: I1004 05:07:31.938602 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" podStartSLOduration=4.938587449 podStartE2EDuration="4.938587449s" podCreationTimestamp="2025-10-04 05:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:31.937666153 +0000 UTC m=+1277.791809195" watchObservedRunningTime="2025-10-04 05:07:31.938587449 +0000 UTC m=+1277.792730481" Oct 04 05:07:32 crc kubenswrapper[4574]: I1004 05:07:32.906507 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] 
Oct 04 05:07:32 crc kubenswrapper[4574]: I1004 05:07:32.921414 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" event={"ID":"9f813f9d-cacd-47ec-9f90-889f59e98949","Type":"ContainerStarted","Data":"df47e842772177fbb92cc2c8c3479bd7afc8882a380feeba9001cf3c22dbe611"} Oct 04 05:07:32 crc kubenswrapper[4574]: I1004 05:07:32.921478 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" event={"ID":"9f813f9d-cacd-47ec-9f90-889f59e98949","Type":"ContainerStarted","Data":"aeaa4e396fcbd914073cba3f7adeeaa2bbd17fd11f405becaa4450c7abc69805"} Oct 04 05:07:32 crc kubenswrapper[4574]: I1004 05:07:32.934462 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:07:32 crc kubenswrapper[4574]: I1004 05:07:32.995464 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" podStartSLOduration=2.995440716 podStartE2EDuration="2.995440716s" podCreationTimestamp="2025-10-04 05:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:32.945798633 +0000 UTC m=+1278.799941675" watchObservedRunningTime="2025-10-04 05:07:32.995440716 +0000 UTC m=+1278.849583758" Oct 04 05:07:36 crc kubenswrapper[4574]: I1004 05:07:36.978031 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf330144-5f7d-44a5-ae98-85860c9d5ce5","Type":"ContainerStarted","Data":"8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76"} Oct 04 05:07:36 crc kubenswrapper[4574]: I1004 05:07:36.983612 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf11c800-77f4-491f-92da-13efb6916962","Type":"ContainerStarted","Data":"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd"} Oct 04 05:07:36 
crc kubenswrapper[4574]: I1004 05:07:36.983659 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf11c800-77f4-491f-92da-13efb6916962","Type":"ContainerStarted","Data":"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739"} Oct 04 05:07:36 crc kubenswrapper[4574]: I1004 05:07:36.983785 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-log" containerID="cri-o://d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739" gracePeriod=30 Oct 04 05:07:36 crc kubenswrapper[4574]: I1004 05:07:36.983900 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-metadata" containerID="cri-o://a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd" gracePeriod=30 Oct 04 05:07:36 crc kubenswrapper[4574]: I1004 05:07:36.987541 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ae90061f-8906-44b4-8195-286492c8d770","Type":"ContainerStarted","Data":"6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d"} Oct 04 05:07:36 crc kubenswrapper[4574]: I1004 05:07:36.987597 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ae90061f-8906-44b4-8195-286492c8d770" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d" gracePeriod=30 Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.007464 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b","Type":"ContainerStarted","Data":"f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39"} Oct 04 05:07:37 crc 
kubenswrapper[4574]: I1004 05:07:37.007512 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b","Type":"ContainerStarted","Data":"8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080"} Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.012165 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.8113806180000003 podStartE2EDuration="10.012106225s" podCreationTimestamp="2025-10-04 05:07:27 +0000 UTC" firstStartedPulling="2025-10-04 05:07:29.500778748 +0000 UTC m=+1275.354921790" lastFinishedPulling="2025-10-04 05:07:35.701504355 +0000 UTC m=+1281.555647397" observedRunningTime="2025-10-04 05:07:37.000415142 +0000 UTC m=+1282.854558194" watchObservedRunningTime="2025-10-04 05:07:37.012106225 +0000 UTC m=+1282.866249267" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.034614 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.6680519179999997 podStartE2EDuration="10.034571935s" podCreationTimestamp="2025-10-04 05:07:27 +0000 UTC" firstStartedPulling="2025-10-04 05:07:29.330389427 +0000 UTC m=+1275.184532469" lastFinishedPulling="2025-10-04 05:07:35.696909444 +0000 UTC m=+1281.551052486" observedRunningTime="2025-10-04 05:07:37.023583612 +0000 UTC m=+1282.877726654" watchObservedRunningTime="2025-10-04 05:07:37.034571935 +0000 UTC m=+1282.888714997" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.064508 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.980355638 podStartE2EDuration="10.064482326s" podCreationTimestamp="2025-10-04 05:07:27 +0000 UTC" firstStartedPulling="2025-10-04 05:07:29.612989942 +0000 UTC m=+1275.467132984" lastFinishedPulling="2025-10-04 05:07:35.69711663 +0000 UTC m=+1281.551259672" 
observedRunningTime="2025-10-04 05:07:37.052897507 +0000 UTC m=+1282.907040559" watchObservedRunningTime="2025-10-04 05:07:37.064482326 +0000 UTC m=+1282.918625368" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.081210 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.344227309 podStartE2EDuration="10.081192602s" podCreationTimestamp="2025-10-04 05:07:27 +0000 UTC" firstStartedPulling="2025-10-04 05:07:28.96235052 +0000 UTC m=+1274.816493562" lastFinishedPulling="2025-10-04 05:07:35.699315813 +0000 UTC m=+1281.553458855" observedRunningTime="2025-10-04 05:07:37.074520432 +0000 UTC m=+1282.928663474" watchObservedRunningTime="2025-10-04 05:07:37.081192602 +0000 UTC m=+1282.935335644" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.755006 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.799945 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-combined-ca-bundle\") pod \"cf11c800-77f4-491f-92da-13efb6916962\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.814494 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-config-data\") pod \"cf11c800-77f4-491f-92da-13efb6916962\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.814728 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k2t2\" (UniqueName: \"kubernetes.io/projected/cf11c800-77f4-491f-92da-13efb6916962-kube-api-access-8k2t2\") pod \"cf11c800-77f4-491f-92da-13efb6916962\" (UID: 
\"cf11c800-77f4-491f-92da-13efb6916962\") " Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.814795 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf11c800-77f4-491f-92da-13efb6916962-logs\") pod \"cf11c800-77f4-491f-92da-13efb6916962\" (UID: \"cf11c800-77f4-491f-92da-13efb6916962\") " Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.816048 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf11c800-77f4-491f-92da-13efb6916962-logs" (OuterVolumeSpecName: "logs") pod "cf11c800-77f4-491f-92da-13efb6916962" (UID: "cf11c800-77f4-491f-92da-13efb6916962"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.825373 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf11c800-77f4-491f-92da-13efb6916962-kube-api-access-8k2t2" (OuterVolumeSpecName: "kube-api-access-8k2t2") pod "cf11c800-77f4-491f-92da-13efb6916962" (UID: "cf11c800-77f4-491f-92da-13efb6916962"). InnerVolumeSpecName "kube-api-access-8k2t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.829513 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.829577 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.845357 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf11c800-77f4-491f-92da-13efb6916962" (UID: "cf11c800-77f4-491f-92da-13efb6916962"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.877446 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-config-data" (OuterVolumeSpecName: "config-data") pod "cf11c800-77f4-491f-92da-13efb6916962" (UID: "cf11c800-77f4-491f-92da-13efb6916962"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.901356 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.918036 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k2t2\" (UniqueName: \"kubernetes.io/projected/cf11c800-77f4-491f-92da-13efb6916962-kube-api-access-8k2t2\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.918096 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf11c800-77f4-491f-92da-13efb6916962-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.918108 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.918118 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf11c800-77f4-491f-92da-13efb6916962-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.989776 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 04 05:07:37 crc kubenswrapper[4574]: I1004 05:07:37.989830 4574 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.036781 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.071583 4574 generic.go:334] "Generic (PLEG): container finished" podID="cf11c800-77f4-491f-92da-13efb6916962" containerID="a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd" exitCode=0 Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.071616 4574 generic.go:334] "Generic (PLEG): container finished" podID="cf11c800-77f4-491f-92da-13efb6916962" containerID="d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739" exitCode=143 Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.071661 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf11c800-77f4-491f-92da-13efb6916962","Type":"ContainerDied","Data":"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd"} Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.071743 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf11c800-77f4-491f-92da-13efb6916962","Type":"ContainerDied","Data":"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739"} Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.071762 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf11c800-77f4-491f-92da-13efb6916962","Type":"ContainerDied","Data":"e14e465968c9a95030fab2032d0f5eae94b6eb2bb89815045ce8a16ff92d968a"} Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.071806 4574 scope.go:117] "RemoveContainer" containerID="a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.072088 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.124571 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.149901 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.158228 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.162436 4574 scope.go:117] "RemoveContainer" containerID="d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.195356 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:38 crc kubenswrapper[4574]: E1004 05:07:38.195863 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-log" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.195883 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-log" Oct 04 05:07:38 crc kubenswrapper[4574]: E1004 05:07:38.195913 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-metadata" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.195921 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-metadata" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.196888 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-metadata" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.196919 4574 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="cf11c800-77f4-491f-92da-13efb6916962" containerName="nova-metadata-log" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.198449 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.207126 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.207373 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.226383 4574 scope.go:117] "RemoveContainer" containerID="a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd" Oct 04 05:07:38 crc kubenswrapper[4574]: E1004 05:07:38.232508 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd\": container with ID starting with a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd not found: ID does not exist" containerID="a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.232567 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd"} err="failed to get container status \"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd\": rpc error: code = NotFound desc = could not find container \"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd\": container with ID starting with a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd not found: ID does not exist" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.232601 4574 scope.go:117] "RemoveContainer" 
containerID="d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.232927 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.233000 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-config-data\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.233131 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6686064d-71cc-4b0d-bcd9-7befbfb27541-logs\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.233172 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.233283 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgrl\" (UniqueName: \"kubernetes.io/projected/6686064d-71cc-4b0d-bcd9-7befbfb27541-kube-api-access-pwgrl\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 
04 05:07:38 crc kubenswrapper[4574]: E1004 05:07:38.241407 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739\": container with ID starting with d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739 not found: ID does not exist" containerID="d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.241463 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739"} err="failed to get container status \"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739\": rpc error: code = NotFound desc = could not find container \"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739\": container with ID starting with d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739 not found: ID does not exist" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.241497 4574 scope.go:117] "RemoveContainer" containerID="a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.242871 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd"} err="failed to get container status \"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd\": rpc error: code = NotFound desc = could not find container \"a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd\": container with ID starting with a8e1261a5632b2d18932772f45591900f6b31501f54a90f5d2b5f7db76c314bd not found: ID does not exist" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.242910 4574 scope.go:117] "RemoveContainer" 
containerID="d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.247391 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739"} err="failed to get container status \"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739\": rpc error: code = NotFound desc = could not find container \"d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739\": container with ID starting with d474e600ab68127ab97cfa19b78be750d2eb6f396f08d1ee5e5cd76495438739 not found: ID does not exist" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.248087 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.340366 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6686064d-71cc-4b0d-bcd9-7befbfb27541-logs\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.340440 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.340551 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgrl\" (UniqueName: \"kubernetes.io/projected/6686064d-71cc-4b0d-bcd9-7befbfb27541-kube-api-access-pwgrl\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.340599 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.340638 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-config-data\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.342939 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6686064d-71cc-4b0d-bcd9-7befbfb27541-logs\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.350053 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-config-data\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.350706 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.361405 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.373867 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgrl\" (UniqueName: \"kubernetes.io/projected/6686064d-71cc-4b0d-bcd9-7befbfb27541-kube-api-access-pwgrl\") pod \"nova-metadata-0\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.531573 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.542843 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.754908 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bcjvm"] Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.755131 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" podUID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerName="dnsmasq-dns" containerID="cri-o://f1e5f87f4d50caabc590bc71bccd0a5aef1a0c7a023ce2635e6eedb76b2b112a" gracePeriod=10 Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.832709 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf11c800-77f4-491f-92da-13efb6916962" path="/var/lib/kubelet/pods/cf11c800-77f4-491f-92da-13efb6916962/volumes" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.918855 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 05:07:38 crc kubenswrapper[4574]: I1004 05:07:38.919339 4574 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.123537 4574 generic.go:334] "Generic (PLEG): container finished" podID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerID="f1e5f87f4d50caabc590bc71bccd0a5aef1a0c7a023ce2635e6eedb76b2b112a" exitCode=0 Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.123645 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" event={"ID":"b6ee152d-8343-47d6-8a16-cfed435bee04","Type":"ContainerDied","Data":"f1e5f87f4d50caabc590bc71bccd0a5aef1a0c7a023ce2635e6eedb76b2b112a"} Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.272377 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:39 crc kubenswrapper[4574]: W1004 05:07:39.272739 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6686064d_71cc_4b0d_bcd9_7befbfb27541.slice/crio-4a527c936a723d979ce578759bf2d65463751417bd5c43001116588b5c285920 WatchSource:0}: Error finding container 4a527c936a723d979ce578759bf2d65463751417bd5c43001116588b5c285920: Status 404 returned error can't find the container with id 4a527c936a723d979ce578759bf2d65463751417bd5c43001116588b5c285920 Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.580548 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.668853 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-nb\") pod \"b6ee152d-8343-47d6-8a16-cfed435bee04\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.669049 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx8mv\" (UniqueName: \"kubernetes.io/projected/b6ee152d-8343-47d6-8a16-cfed435bee04-kube-api-access-rx8mv\") pod \"b6ee152d-8343-47d6-8a16-cfed435bee04\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.669152 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-svc\") pod \"b6ee152d-8343-47d6-8a16-cfed435bee04\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.669204 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-config\") pod \"b6ee152d-8343-47d6-8a16-cfed435bee04\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.669708 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-swift-storage-0\") pod \"b6ee152d-8343-47d6-8a16-cfed435bee04\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.669828 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-sb\") pod \"b6ee152d-8343-47d6-8a16-cfed435bee04\" (UID: \"b6ee152d-8343-47d6-8a16-cfed435bee04\") " Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.679469 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ee152d-8343-47d6-8a16-cfed435bee04-kube-api-access-rx8mv" (OuterVolumeSpecName: "kube-api-access-rx8mv") pod "b6ee152d-8343-47d6-8a16-cfed435bee04" (UID: "b6ee152d-8343-47d6-8a16-cfed435bee04"). InnerVolumeSpecName "kube-api-access-rx8mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.736856 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.773516 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx8mv\" (UniqueName: \"kubernetes.io/projected/b6ee152d-8343-47d6-8a16-cfed435bee04-kube-api-access-rx8mv\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.791992 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6ee152d-8343-47d6-8a16-cfed435bee04" (UID: "b6ee152d-8343-47d6-8a16-cfed435bee04"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.809430 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-config" (OuterVolumeSpecName: "config") pod "b6ee152d-8343-47d6-8a16-cfed435bee04" (UID: "b6ee152d-8343-47d6-8a16-cfed435bee04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.809546 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6ee152d-8343-47d6-8a16-cfed435bee04" (UID: "b6ee152d-8343-47d6-8a16-cfed435bee04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.824006 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6ee152d-8343-47d6-8a16-cfed435bee04" (UID: "b6ee152d-8343-47d6-8a16-cfed435bee04"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.824901 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6ee152d-8343-47d6-8a16-cfed435bee04" (UID: "b6ee152d-8343-47d6-8a16-cfed435bee04"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.843454 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57bfb4d496-nv6hv" podUID="85281a42-f9ab-4302-9fe9-4e742075530f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.875368 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.875611 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.875735 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.875841 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:39 crc kubenswrapper[4574]: I1004 05:07:39.875931 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ee152d-8343-47d6-8a16-cfed435bee04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.164817 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"6686064d-71cc-4b0d-bcd9-7befbfb27541","Type":"ContainerStarted","Data":"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9"} Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.165164 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6686064d-71cc-4b0d-bcd9-7befbfb27541","Type":"ContainerStarted","Data":"4a527c936a723d979ce578759bf2d65463751417bd5c43001116588b5c285920"} Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.167604 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" event={"ID":"b6ee152d-8343-47d6-8a16-cfed435bee04","Type":"ContainerDied","Data":"076c21926a5e0e9e76fd035eccb641531423e14ef4b74cdcb8e0663099888c15"} Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.167658 4574 scope.go:117] "RemoveContainer" containerID="f1e5f87f4d50caabc590bc71bccd0a5aef1a0c7a023ce2635e6eedb76b2b112a" Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.167824 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bcjvm" Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.222726 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bcjvm"] Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.235310 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bcjvm"] Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.245048 4574 scope.go:117] "RemoveContainer" containerID="c6e7364d21fdb211214c80ddd787a14c2c37d0919f4cd13c3a74b29feb66d9b4" Oct 04 05:07:40 crc kubenswrapper[4574]: I1004 05:07:40.747645 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ee152d-8343-47d6-8a16-cfed435bee04" path="/var/lib/kubelet/pods/b6ee152d-8343-47d6-8a16-cfed435bee04/volumes" Oct 04 05:07:41 crc kubenswrapper[4574]: I1004 05:07:41.179754 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6686064d-71cc-4b0d-bcd9-7befbfb27541","Type":"ContainerStarted","Data":"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82"} Oct 04 05:07:43 crc kubenswrapper[4574]: I1004 05:07:43.543588 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 05:07:43 crc kubenswrapper[4574]: I1004 05:07:43.545143 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 05:07:44 crc kubenswrapper[4574]: I1004 05:07:44.210139 4574 generic.go:334] "Generic (PLEG): container finished" podID="cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" containerID="fde8a19cb416d9c2c9e7e3823e81f0ffe0b062d645c96114e137d08076b30e1f" exitCode=0 Oct 04 05:07:44 crc kubenswrapper[4574]: I1004 05:07:44.210266 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dxjss" 
event={"ID":"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25","Type":"ContainerDied","Data":"fde8a19cb416d9c2c9e7e3823e81f0ffe0b062d645c96114e137d08076b30e1f"} Oct 04 05:07:44 crc kubenswrapper[4574]: I1004 05:07:44.238978 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=6.238954585 podStartE2EDuration="6.238954585s" podCreationTimestamp="2025-10-04 05:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:41.221659556 +0000 UTC m=+1287.075802608" watchObservedRunningTime="2025-10-04 05:07:44.238954585 +0000 UTC m=+1290.093097627" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.226301 4574 generic.go:334] "Generic (PLEG): container finished" podID="9f813f9d-cacd-47ec-9f90-889f59e98949" containerID="df47e842772177fbb92cc2c8c3479bd7afc8882a380feeba9001cf3c22dbe611" exitCode=0 Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.226396 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" event={"ID":"9f813f9d-cacd-47ec-9f90-889f59e98949","Type":"ContainerDied","Data":"df47e842772177fbb92cc2c8c3479bd7afc8882a380feeba9001cf3c22dbe611"} Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.608065 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.705423 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-combined-ca-bundle\") pod \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.705663 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6wgn\" (UniqueName: \"kubernetes.io/projected/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-kube-api-access-n6wgn\") pod \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.705810 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-config-data\") pod \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.705864 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-scripts\") pod \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\" (UID: \"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25\") " Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.710485 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-scripts" (OuterVolumeSpecName: "scripts") pod "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" (UID: "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.712119 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-kube-api-access-n6wgn" (OuterVolumeSpecName: "kube-api-access-n6wgn") pod "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" (UID: "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25"). InnerVolumeSpecName "kube-api-access-n6wgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.734518 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" (UID: "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.736567 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-config-data" (OuterVolumeSpecName: "config-data") pod "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" (UID: "cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.808296 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6wgn\" (UniqueName: \"kubernetes.io/projected/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-kube-api-access-n6wgn\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.808342 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.808352 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:45 crc kubenswrapper[4574]: I1004 05:07:45.808362 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.235747 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dxjss" event={"ID":"cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25","Type":"ContainerDied","Data":"bad33cd36963b5cbff962034b149243e97d7fa047dda6a2a802693e852b83b21"} Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.235797 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad33cd36963b5cbff962034b149243e97d7fa047dda6a2a802693e852b83b21" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.235802 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dxjss" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.459851 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.460104 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-log" containerID="cri-o://8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080" gracePeriod=30 Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.460649 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-api" containerID="cri-o://f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39" gracePeriod=30 Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.473481 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.473743 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bf330144-5f7d-44a5-ae98-85860c9d5ce5" containerName="nova-scheduler-scheduler" containerID="cri-o://8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" gracePeriod=30 Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.489013 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.489325 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerName="nova-metadata-log" containerID="cri-o://652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9" gracePeriod=30 Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.489654 4574 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerName="nova-metadata-metadata" containerID="cri-o://5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82" gracePeriod=30 Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.659814 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.726163 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-combined-ca-bundle\") pod \"9f813f9d-cacd-47ec-9f90-889f59e98949\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.726551 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-config-data\") pod \"9f813f9d-cacd-47ec-9f90-889f59e98949\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.726723 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-scripts\") pod \"9f813f9d-cacd-47ec-9f90-889f59e98949\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.726846 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7r86\" (UniqueName: \"kubernetes.io/projected/9f813f9d-cacd-47ec-9f90-889f59e98949-kube-api-access-c7r86\") pod \"9f813f9d-cacd-47ec-9f90-889f59e98949\" (UID: \"9f813f9d-cacd-47ec-9f90-889f59e98949\") " Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.735009 4574 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f813f9d-cacd-47ec-9f90-889f59e98949-kube-api-access-c7r86" (OuterVolumeSpecName: "kube-api-access-c7r86") pod "9f813f9d-cacd-47ec-9f90-889f59e98949" (UID: "9f813f9d-cacd-47ec-9f90-889f59e98949"). InnerVolumeSpecName "kube-api-access-c7r86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.753994 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-scripts" (OuterVolumeSpecName: "scripts") pod "9f813f9d-cacd-47ec-9f90-889f59e98949" (UID: "9f813f9d-cacd-47ec-9f90-889f59e98949"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.768185 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f813f9d-cacd-47ec-9f90-889f59e98949" (UID: "9f813f9d-cacd-47ec-9f90-889f59e98949"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.793372 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-config-data" (OuterVolumeSpecName: "config-data") pod "9f813f9d-cacd-47ec-9f90-889f59e98949" (UID: "9f813f9d-cacd-47ec-9f90-889f59e98949"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.836811 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.836856 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.836868 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7r86\" (UniqueName: \"kubernetes.io/projected/9f813f9d-cacd-47ec-9f90-889f59e98949-kube-api-access-c7r86\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.836882 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f813f9d-cacd-47ec-9f90-889f59e98949-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:46 crc kubenswrapper[4574]: I1004 05:07:46.997273 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.141481 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwgrl\" (UniqueName: \"kubernetes.io/projected/6686064d-71cc-4b0d-bcd9-7befbfb27541-kube-api-access-pwgrl\") pod \"6686064d-71cc-4b0d-bcd9-7befbfb27541\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.141527 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6686064d-71cc-4b0d-bcd9-7befbfb27541-logs\") pod \"6686064d-71cc-4b0d-bcd9-7befbfb27541\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.141624 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-combined-ca-bundle\") pod \"6686064d-71cc-4b0d-bcd9-7befbfb27541\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.141673 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-nova-metadata-tls-certs\") pod \"6686064d-71cc-4b0d-bcd9-7befbfb27541\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.141945 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6686064d-71cc-4b0d-bcd9-7befbfb27541-logs" (OuterVolumeSpecName: "logs") pod "6686064d-71cc-4b0d-bcd9-7befbfb27541" (UID: "6686064d-71cc-4b0d-bcd9-7befbfb27541"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.142168 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-config-data\") pod \"6686064d-71cc-4b0d-bcd9-7befbfb27541\" (UID: \"6686064d-71cc-4b0d-bcd9-7befbfb27541\") " Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.142653 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6686064d-71cc-4b0d-bcd9-7befbfb27541-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.146418 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6686064d-71cc-4b0d-bcd9-7befbfb27541-kube-api-access-pwgrl" (OuterVolumeSpecName: "kube-api-access-pwgrl") pod "6686064d-71cc-4b0d-bcd9-7befbfb27541" (UID: "6686064d-71cc-4b0d-bcd9-7befbfb27541"). InnerVolumeSpecName "kube-api-access-pwgrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.176372 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-config-data" (OuterVolumeSpecName: "config-data") pod "6686064d-71cc-4b0d-bcd9-7befbfb27541" (UID: "6686064d-71cc-4b0d-bcd9-7befbfb27541"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.179839 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6686064d-71cc-4b0d-bcd9-7befbfb27541" (UID: "6686064d-71cc-4b0d-bcd9-7befbfb27541"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.196854 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6686064d-71cc-4b0d-bcd9-7befbfb27541" (UID: "6686064d-71cc-4b0d-bcd9-7befbfb27541"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.244150 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwgrl\" (UniqueName: \"kubernetes.io/projected/6686064d-71cc-4b0d-bcd9-7befbfb27541-kube-api-access-pwgrl\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.244185 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.244194 4574 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.244204 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6686064d-71cc-4b0d-bcd9-7befbfb27541-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.246120 4574 generic.go:334] "Generic (PLEG): container finished" podID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerID="5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82" exitCode=0 Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.246143 4574 generic.go:334] "Generic (PLEG): container finished" 
podID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerID="652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9" exitCode=143 Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.246172 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6686064d-71cc-4b0d-bcd9-7befbfb27541","Type":"ContainerDied","Data":"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82"} Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.246203 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.246224 4574 scope.go:117] "RemoveContainer" containerID="5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.246214 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6686064d-71cc-4b0d-bcd9-7befbfb27541","Type":"ContainerDied","Data":"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9"} Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.246360 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6686064d-71cc-4b0d-bcd9-7befbfb27541","Type":"ContainerDied","Data":"4a527c936a723d979ce578759bf2d65463751417bd5c43001116588b5c285920"} Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.251799 4574 generic.go:334] "Generic (PLEG): container finished" podID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerID="8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080" exitCode=143 Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.251874 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b","Type":"ContainerDied","Data":"8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080"} Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 
05:07:47.254057 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" event={"ID":"9f813f9d-cacd-47ec-9f90-889f59e98949","Type":"ContainerDied","Data":"aeaa4e396fcbd914073cba3f7adeeaa2bbd17fd11f405becaa4450c7abc69805"} Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.254103 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeaa4e396fcbd914073cba3f7adeeaa2bbd17fd11f405becaa4450c7abc69805" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.254161 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4h6wk" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.287137 4574 scope.go:117] "RemoveContainer" containerID="652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.311780 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.332861 4574 scope.go:117] "RemoveContainer" containerID="5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82" Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.333816 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82\": container with ID starting with 5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82 not found: ID does not exist" containerID="5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.333851 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82"} err="failed to get container status \"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82\": rpc 
error: code = NotFound desc = could not find container \"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82\": container with ID starting with 5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82 not found: ID does not exist" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.333871 4574 scope.go:117] "RemoveContainer" containerID="652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.338863 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.340362 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9\": container with ID starting with 652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9 not found: ID does not exist" containerID="652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.340407 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9"} err="failed to get container status \"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9\": rpc error: code = NotFound desc = could not find container \"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9\": container with ID starting with 652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9 not found: ID does not exist" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.340438 4574 scope.go:117] "RemoveContainer" containerID="5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.340915 4574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82"} err="failed to get container status \"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82\": rpc error: code = NotFound desc = could not find container \"5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82\": container with ID starting with 5b3ffff4eb59cfdbb6a9c426464eba36eeb9494f3040a03b6762b034cf36dc82 not found: ID does not exist" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.340955 4574 scope.go:117] "RemoveContainer" containerID="652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.341682 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9"} err="failed to get container status \"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9\": rpc error: code = NotFound desc = could not find container \"652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9\": container with ID starting with 652f8abf3cfc26f534f24c908fbed771df6f395de367198098c96d05803c38f9 not found: ID does not exist" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.351310 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.351833 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f813f9d-cacd-47ec-9f90-889f59e98949" containerName="nova-cell1-conductor-db-sync" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.351854 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f813f9d-cacd-47ec-9f90-889f59e98949" containerName="nova-cell1-conductor-db-sync" Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.351881 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" 
containerName="nova-metadata-log" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.351890 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerName="nova-metadata-log" Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.351911 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerName="nova-metadata-metadata" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.351918 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerName="nova-metadata-metadata" Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.351937 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerName="init" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.351945 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerName="init" Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.351962 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" containerName="nova-manage" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.351969 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" containerName="nova-manage" Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.351982 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerName="dnsmasq-dns" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.351991 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerName="dnsmasq-dns" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.352280 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ee152d-8343-47d6-8a16-cfed435bee04" containerName="dnsmasq-dns" Oct 04 
05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.352300 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerName="nova-metadata-metadata" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.352311 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" containerName="nova-manage" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.352324 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" containerName="nova-metadata-log" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.352336 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f813f9d-cacd-47ec-9f90-889f59e98949" containerName="nova-cell1-conductor-db-sync" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.353445 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.359265 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.359500 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.367396 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.368976 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.371482 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.382706 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.391011 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447002 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdl4\" (UniqueName: \"kubernetes.io/projected/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-kube-api-access-4cdl4\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447065 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447086 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447142 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-logs\") pod 
\"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447182 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pls2m\" (UniqueName: \"kubernetes.io/projected/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-kube-api-access-pls2m\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447217 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447283 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-config-data\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.447324 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550354 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550444 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-config-data\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550503 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550587 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cdl4\" (UniqueName: \"kubernetes.io/projected/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-kube-api-access-4cdl4\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550622 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550648 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550712 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-logs\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.550749 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pls2m\" (UniqueName: \"kubernetes.io/projected/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-kube-api-access-pls2m\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.551760 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-logs\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.555612 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-config-data\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.556729 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.556928 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.557271 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.561055 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.574185 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdl4\" (UniqueName: \"kubernetes.io/projected/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-kube-api-access-4cdl4\") pod \"nova-metadata-0\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.577379 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pls2m\" (UniqueName: \"kubernetes.io/projected/f6ae6da5-dd08-46b4-94cf-589b9c4f5139-kube-api-access-pls2m\") pod \"nova-cell1-conductor-0\" (UID: \"f6ae6da5-dd08-46b4-94cf-589b9c4f5139\") " pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.679879 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:07:47 crc kubenswrapper[4574]: I1004 05:07:47.694625 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.991173 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.992959 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.994051 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 04 05:07:47 crc kubenswrapper[4574]: E1004 05:07:47.994083 4574 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bf330144-5f7d-44a5-ae98-85860c9d5ce5" containerName="nova-scheduler-scheduler" Oct 04 05:07:48 crc kubenswrapper[4574]: I1004 05:07:48.154891 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:07:48 crc kubenswrapper[4574]: W1004 05:07:48.157859 4574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c0b789f_0313_4df9_8a95_cfd4ec60f6dc.slice/crio-03d84ea732ce47e319d6658dc24a8c7531359676f9ccf76b35db0917d5a256d0 WatchSource:0}: Error finding container 03d84ea732ce47e319d6658dc24a8c7531359676f9ccf76b35db0917d5a256d0: Status 404 returned error can't find the container with id 03d84ea732ce47e319d6658dc24a8c7531359676f9ccf76b35db0917d5a256d0 Oct 04 05:07:48 crc kubenswrapper[4574]: I1004 05:07:48.271455 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc","Type":"ContainerStarted","Data":"03d84ea732ce47e319d6658dc24a8c7531359676f9ccf76b35db0917d5a256d0"} Oct 04 05:07:48 crc kubenswrapper[4574]: I1004 05:07:48.302349 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 04 05:07:48 crc kubenswrapper[4574]: I1004 05:07:48.755474 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6686064d-71cc-4b0d-bcd9-7befbfb27541" path="/var/lib/kubelet/pods/6686064d-71cc-4b0d-bcd9-7befbfb27541/volumes" Oct 04 05:07:48 crc kubenswrapper[4574]: I1004 05:07:48.992692 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.289680 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc","Type":"ContainerStarted","Data":"9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b"} Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.290028 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc","Type":"ContainerStarted","Data":"2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91"} Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.300372 4574 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6ae6da5-dd08-46b4-94cf-589b9c4f5139","Type":"ContainerStarted","Data":"df044cf43b7b56ee536c9902ecfbf2d5348fa4a10df429cad412f1e7b378f633"} Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.300431 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6ae6da5-dd08-46b4-94cf-589b9c4f5139","Type":"ContainerStarted","Data":"3df4aa346f8bb6f2b4547ef7c82eea1410ead6fb6bb8ffe3b0333c65604498cc"} Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.300676 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.319992 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.319975323 podStartE2EDuration="2.319975323s" podCreationTimestamp="2025-10-04 05:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:49.314642879 +0000 UTC m=+1295.168785921" watchObservedRunningTime="2025-10-04 05:07:49.319975323 +0000 UTC m=+1295.174118365" Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.344039 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.344023859 podStartE2EDuration="2.344023859s" podCreationTimestamp="2025-10-04 05:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:49.337308105 +0000 UTC m=+1295.191451147" watchObservedRunningTime="2025-10-04 05:07:49.344023859 +0000 UTC m=+1295.198166901" Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.404250 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:07:49 crc kubenswrapper[4574]: I1004 05:07:49.404302 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.073965 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.202881 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwwj7\" (UniqueName: \"kubernetes.io/projected/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-kube-api-access-hwwj7\") pod \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.203034 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-combined-ca-bundle\") pod \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.203202 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-config-data\") pod \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.203282 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-logs\") pod \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\" (UID: \"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b\") " Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.204183 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-logs" (OuterVolumeSpecName: "logs") pod "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" (UID: "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.224734 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-kube-api-access-hwwj7" (OuterVolumeSpecName: "kube-api-access-hwwj7") pod "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" (UID: "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b"). InnerVolumeSpecName "kube-api-access-hwwj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.261317 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-config-data" (OuterVolumeSpecName: "config-data") pod "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" (UID: "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.269302 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" (UID: "d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.314622 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.314668 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.314684 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.314699 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwwj7\" (UniqueName: \"kubernetes.io/projected/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b-kube-api-access-hwwj7\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.327562 4574 generic.go:334] "Generic (PLEG): container finished" podID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerID="f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39" exitCode=0 Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.328555 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.337347 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b","Type":"ContainerDied","Data":"f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39"} Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.337722 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b","Type":"ContainerDied","Data":"e7910f258bded4499b06e20bc3ad5595c19d330b98e1ba1562a5da4a042c6012"} Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.337752 4574 scope.go:117] "RemoveContainer" containerID="f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.378633 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.395101 4574 scope.go:117] "RemoveContainer" containerID="8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.404517 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.425653 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:50 crc kubenswrapper[4574]: E1004 05:07:50.426207 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-api" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.426226 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-api" Oct 04 05:07:50 crc kubenswrapper[4574]: E1004 05:07:50.426255 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" 
containerName="nova-api-log" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.426263 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-log" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.426482 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-api" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.426504 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" containerName="nova-api-log" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.428200 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.431638 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.436493 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.440835 4574 scope.go:117] "RemoveContainer" containerID="f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39" Oct 04 05:07:50 crc kubenswrapper[4574]: E1004 05:07:50.441558 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39\": container with ID starting with f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39 not found: ID does not exist" containerID="f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.441608 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39"} err="failed to get container 
status \"f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39\": rpc error: code = NotFound desc = could not find container \"f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39\": container with ID starting with f9dd56bf751443a6260c5e07bcf663a737c5ba6cb9e134b590080fb9c629ce39 not found: ID does not exist" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.441632 4574 scope.go:117] "RemoveContainer" containerID="8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080" Oct 04 05:07:50 crc kubenswrapper[4574]: E1004 05:07:50.441972 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080\": container with ID starting with 8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080 not found: ID does not exist" containerID="8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.441994 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080"} err="failed to get container status \"8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080\": rpc error: code = NotFound desc = could not find container \"8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080\": container with ID starting with 8d8cec8067beb170ff28cabfe3ebedc32b7e51448655d43db7ff894551f7d080 not found: ID does not exist" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.520148 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-logs\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.520207 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-config-data\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.520283 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.520331 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78tws\" (UniqueName: \"kubernetes.io/projected/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-kube-api-access-78tws\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.621813 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-logs\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.621856 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-config-data\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.621921 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.621976 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78tws\" (UniqueName: \"kubernetes.io/projected/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-kube-api-access-78tws\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.622541 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-logs\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.627151 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.627842 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-config-data\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.644866 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78tws\" (UniqueName: \"kubernetes.io/projected/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-kube-api-access-78tws\") pod \"nova-api-0\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.758441 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.776800 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b" path="/var/lib/kubelet/pods/d0b82e1a-4a98-4ad8-8f3e-f813986e3e9b/volumes" Oct 04 05:07:50 crc kubenswrapper[4574]: I1004 05:07:50.901688 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.036413 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-combined-ca-bundle\") pod \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.036606 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-config-data\") pod \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.036924 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7phqz\" (UniqueName: \"kubernetes.io/projected/bf330144-5f7d-44a5-ae98-85860c9d5ce5-kube-api-access-7phqz\") pod \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\" (UID: \"bf330144-5f7d-44a5-ae98-85860c9d5ce5\") " Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.041294 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf330144-5f7d-44a5-ae98-85860c9d5ce5-kube-api-access-7phqz" (OuterVolumeSpecName: "kube-api-access-7phqz") pod "bf330144-5f7d-44a5-ae98-85860c9d5ce5" (UID: "bf330144-5f7d-44a5-ae98-85860c9d5ce5"). InnerVolumeSpecName "kube-api-access-7phqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.067633 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-config-data" (OuterVolumeSpecName: "config-data") pod "bf330144-5f7d-44a5-ae98-85860c9d5ce5" (UID: "bf330144-5f7d-44a5-ae98-85860c9d5ce5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.133369 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf330144-5f7d-44a5-ae98-85860c9d5ce5" (UID: "bf330144-5f7d-44a5-ae98-85860c9d5ce5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.139166 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7phqz\" (UniqueName: \"kubernetes.io/projected/bf330144-5f7d-44a5-ae98-85860c9d5ce5-kube-api-access-7phqz\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.139203 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.139214 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf330144-5f7d-44a5-ae98-85860c9d5ce5-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.346979 4574 generic.go:334] "Generic (PLEG): container finished" podID="bf330144-5f7d-44a5-ae98-85860c9d5ce5" containerID="8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" 
exitCode=0 Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.347080 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf330144-5f7d-44a5-ae98-85860c9d5ce5","Type":"ContainerDied","Data":"8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76"} Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.347307 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf330144-5f7d-44a5-ae98-85860c9d5ce5","Type":"ContainerDied","Data":"f07182c5f77060230ddd49741922e3c55f7aa82973ed534d517ada58466d7527"} Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.347330 4574 scope.go:117] "RemoveContainer" containerID="8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.347122 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.389416 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.398345 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.398437 4574 scope.go:117] "RemoveContainer" containerID="8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" Oct 04 05:07:51 crc kubenswrapper[4574]: E1004 05:07:51.400832 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76\": container with ID starting with 8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76 not found: ID does not exist" containerID="8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.400886 4574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76"} err="failed to get container status \"8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76\": rpc error: code = NotFound desc = could not find container \"8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76\": container with ID starting with 8380b0c3dece602cf1f0f1b66a55e41570fac6e565510ea00ca8cd66ae9a5f76 not found: ID does not exist" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.427540 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.441968 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:51 crc kubenswrapper[4574]: E1004 05:07:51.442561 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf330144-5f7d-44a5-ae98-85860c9d5ce5" containerName="nova-scheduler-scheduler" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.442658 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf330144-5f7d-44a5-ae98-85860c9d5ce5" containerName="nova-scheduler-scheduler" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.443198 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf330144-5f7d-44a5-ae98-85860c9d5ce5" containerName="nova-scheduler-scheduler" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.445123 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.456872 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.457730 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.546464 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xznxr\" (UniqueName: \"kubernetes.io/projected/a0e56256-b1dd-46b7-a662-80a85f177980-kube-api-access-xznxr\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.546817 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-config-data\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.547036 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.649341 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xznxr\" (UniqueName: \"kubernetes.io/projected/a0e56256-b1dd-46b7-a662-80a85f177980-kube-api-access-xznxr\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.649394 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-config-data\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.649456 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.655756 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-config-data\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.655984 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.674213 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xznxr\" (UniqueName: \"kubernetes.io/projected/a0e56256-b1dd-46b7-a662-80a85f177980-kube-api-access-xznxr\") pod \"nova-scheduler-0\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " pod="openstack/nova-scheduler-0" Oct 04 05:07:51 crc kubenswrapper[4574]: I1004 05:07:51.769375 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:07:52 crc kubenswrapper[4574]: W1004 05:07:52.307438 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0e56256_b1dd_46b7_a662_80a85f177980.slice/crio-51a8812350313affd6635bae1a0938d3825682b4fb0d0bba81635a19ce43ea62 WatchSource:0}: Error finding container 51a8812350313affd6635bae1a0938d3825682b4fb0d0bba81635a19ce43ea62: Status 404 returned error can't find the container with id 51a8812350313affd6635bae1a0938d3825682b4fb0d0bba81635a19ce43ea62 Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.310959 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.370230 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7","Type":"ContainerStarted","Data":"9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a"} Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.376365 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7","Type":"ContainerStarted","Data":"0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd"} Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.376388 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7","Type":"ContainerStarted","Data":"e26111a9d3e5fbfc967bde7e5819b91db61f52122db0299655121e87fe59c32a"} Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.385140 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a0e56256-b1dd-46b7-a662-80a85f177980","Type":"ContainerStarted","Data":"51a8812350313affd6635bae1a0938d3825682b4fb0d0bba81635a19ce43ea62"} Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 
05:07:52.391921 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.391907175 podStartE2EDuration="2.391907175s" podCreationTimestamp="2025-10-04 05:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:52.389606108 +0000 UTC m=+1298.243749150" watchObservedRunningTime="2025-10-04 05:07:52.391907175 +0000 UTC m=+1298.246050217" Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.680780 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.680908 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 05:07:52 crc kubenswrapper[4574]: I1004 05:07:52.750297 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf330144-5f7d-44a5-ae98-85860c9d5ce5" path="/var/lib/kubelet/pods/bf330144-5f7d-44a5-ae98-85860c9d5ce5/volumes" Oct 04 05:07:53 crc kubenswrapper[4574]: I1004 05:07:53.396726 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a0e56256-b1dd-46b7-a662-80a85f177980","Type":"ContainerStarted","Data":"37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1"} Oct 04 05:07:53 crc kubenswrapper[4574]: I1004 05:07:53.424409 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.424385541 podStartE2EDuration="2.424385541s" podCreationTimestamp="2025-10-04 05:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:07:53.414155205 +0000 UTC m=+1299.268298247" watchObservedRunningTime="2025-10-04 05:07:53.424385541 +0000 UTC m=+1299.278528583" Oct 04 05:07:54 crc kubenswrapper[4574]: I1004 
05:07:54.321429 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:07:54 crc kubenswrapper[4574]: I1004 05:07:54.404767 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:07:54 crc kubenswrapper[4574]: I1004 05:07:54.553390 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:07:54 crc kubenswrapper[4574]: I1004 05:07:54.553648 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339" containerName="kube-state-metrics" containerID="cri-o://9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4" gracePeriod=30 Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.107608 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.232179 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4qqz\" (UniqueName: \"kubernetes.io/projected/f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339-kube-api-access-f4qqz\") pod \"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339\" (UID: \"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339\") " Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.239435 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339-kube-api-access-f4qqz" (OuterVolumeSpecName: "kube-api-access-f4qqz") pod "f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339" (UID: "f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339"). InnerVolumeSpecName "kube-api-access-f4qqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.334187 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4qqz\" (UniqueName: \"kubernetes.io/projected/f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339-kube-api-access-f4qqz\") on node \"crc\" DevicePath \"\"" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.420171 4574 generic.go:334] "Generic (PLEG): container finished" podID="f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339" containerID="9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4" exitCode=2 Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.420220 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339","Type":"ContainerDied","Data":"9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4"} Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.420339 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339","Type":"ContainerDied","Data":"c35f8326c2d5c469e5ee052988be4c903d08b356313b8f8609f3d89d6fdc0088"} Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.420364 4574 scope.go:117] "RemoveContainer" containerID="9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.420460 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.469267 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.469879 4574 scope.go:117] "RemoveContainer" containerID="9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4" Oct 04 05:07:55 crc kubenswrapper[4574]: E1004 05:07:55.470262 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4\": container with ID starting with 9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4 not found: ID does not exist" containerID="9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.471489 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4"} err="failed to get container status \"9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4\": rpc error: code = NotFound desc = could not find container \"9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4\": container with ID starting with 9c5b4b06a770eb69a2f415d8ee87d08636098a80d9e3de125f1d47a33e6d94f4 not found: ID does not exist" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.490985 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.499786 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:07:55 crc kubenswrapper[4574]: E1004 05:07:55.500271 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339" containerName="kube-state-metrics" Oct 04 05:07:55 crc 
kubenswrapper[4574]: I1004 05:07:55.500292 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339" containerName="kube-state-metrics" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.500525 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339" containerName="kube-state-metrics" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.501200 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.504015 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.508561 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.517782 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.639588 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.639655 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfg9t\" (UniqueName: \"kubernetes.io/projected/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-api-access-jfg9t\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.639730 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.639890 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.741862 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.742206 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfg9t\" (UniqueName: \"kubernetes.io/projected/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-api-access-jfg9t\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.742499 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.743211 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.746695 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.747945 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.757957 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd47459a-7171-4d7e-8f65-20a2936ce760-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.763542 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfg9t\" (UniqueName: \"kubernetes.io/projected/cd47459a-7171-4d7e-8f65-20a2936ce760-kube-api-access-jfg9t\") pod \"kube-state-metrics-0\" (UID: \"cd47459a-7171-4d7e-8f65-20a2936ce760\") " pod="openstack/kube-state-metrics-0" Oct 04 05:07:55 crc kubenswrapper[4574]: I1004 05:07:55.830401 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.348779 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 04 05:07:56 crc kubenswrapper[4574]: W1004 05:07:56.355570 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd47459a_7171_4d7e_8f65_20a2936ce760.slice/crio-1e64bad7e6877191f31facf552505f43b196255bff46a478beb5149748c7807b WatchSource:0}: Error finding container 1e64bad7e6877191f31facf552505f43b196255bff46a478beb5149748c7807b: Status 404 returned error can't find the container with id 1e64bad7e6877191f31facf552505f43b196255bff46a478beb5149748c7807b Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.358200 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.431553 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd47459a-7171-4d7e-8f65-20a2936ce760","Type":"ContainerStarted","Data":"1e64bad7e6877191f31facf552505f43b196255bff46a478beb5149748c7807b"} Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.584312 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.748826 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339" path="/var/lib/kubelet/pods/f32ab1b1-0d3b-4e5d-8e26-770aa0e9f339/volumes" Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.770227 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.849862 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-57bfb4d496-nv6hv" Oct 04 05:07:56 crc kubenswrapper[4574]: I1004 05:07:56.949896 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57c7ff446b-7tmwn"] Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.209422 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.210052 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="proxy-httpd" containerID="cri-o://5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012" gracePeriod=30 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.210083 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-notification-agent" containerID="cri-o://873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260" gracePeriod=30 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.210259 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="sg-core" containerID="cri-o://ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725" gracePeriod=30 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.210527 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-central-agent" containerID="cri-o://bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474" gracePeriod=30 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.443748 4574 generic.go:334] "Generic (PLEG): container finished" podID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerID="5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012" 
exitCode=0 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.443791 4574 generic.go:334] "Generic (PLEG): container finished" podID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerID="ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725" exitCode=2 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.443847 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerDied","Data":"5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012"} Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.443902 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerDied","Data":"ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725"} Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.446269 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon-log" containerID="cri-o://98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa" gracePeriod=30 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.447510 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cd47459a-7171-4d7e-8f65-20a2936ce760","Type":"ContainerStarted","Data":"1196ef0e565f859ae127cc20f0f7c4e9e0af90fd838ccb156dbbc72df20953ab"} Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.447548 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.447884 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" 
containerID="cri-o://403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469" gracePeriod=30 Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.483182 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.098374161 podStartE2EDuration="2.483159899s" podCreationTimestamp="2025-10-04 05:07:55 +0000 UTC" firstStartedPulling="2025-10-04 05:07:56.357959958 +0000 UTC m=+1302.212102990" lastFinishedPulling="2025-10-04 05:07:56.742745686 +0000 UTC m=+1302.596888728" observedRunningTime="2025-10-04 05:07:57.473593232 +0000 UTC m=+1303.327736274" watchObservedRunningTime="2025-10-04 05:07:57.483159899 +0000 UTC m=+1303.337302941" Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.681498 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.682136 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 05:07:57 crc kubenswrapper[4574]: I1004 05:07:57.727747 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 04 05:07:58 crc kubenswrapper[4574]: I1004 05:07:58.458617 4574 generic.go:334] "Generic (PLEG): container finished" podID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerID="bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474" exitCode=0 Oct 04 05:07:58 crc kubenswrapper[4574]: I1004 05:07:58.458681 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerDied","Data":"bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474"} Oct 04 05:07:58 crc kubenswrapper[4574]: I1004 05:07:58.696400 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:07:58 crc kubenswrapper[4574]: I1004 05:07:58.696400 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:07:59 crc kubenswrapper[4574]: E1004 05:07:59.711938 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8be556e_5e14_4b4f_8d2f_a28f68c521fb.slice/crio-873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8be556e_5e14_4b4f_8d2f_a28f68c521fb.slice/crio-conmon-873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:07:59 crc kubenswrapper[4574]: I1004 05:07:59.952484 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.035617 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-combined-ca-bundle\") pod \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.035715 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-run-httpd\") pod \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.035748 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-scripts\") pod \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.035869 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-config-data\") pod \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.035902 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5lfs\" (UniqueName: \"kubernetes.io/projected/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-kube-api-access-g5lfs\") pod \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.035941 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-sg-core-conf-yaml\") pod \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.035977 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-log-httpd\") pod \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\" (UID: \"b8be556e-5e14-4b4f-8d2f-a28f68c521fb\") " Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.038491 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b8be556e-5e14-4b4f-8d2f-a28f68c521fb" (UID: "b8be556e-5e14-4b4f-8d2f-a28f68c521fb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.046936 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b8be556e-5e14-4b4f-8d2f-a28f68c521fb" (UID: "b8be556e-5e14-4b4f-8d2f-a28f68c521fb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.047836 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-kube-api-access-g5lfs" (OuterVolumeSpecName: "kube-api-access-g5lfs") pod "b8be556e-5e14-4b4f-8d2f-a28f68c521fb" (UID: "b8be556e-5e14-4b4f-8d2f-a28f68c521fb"). InnerVolumeSpecName "kube-api-access-g5lfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.058813 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-scripts" (OuterVolumeSpecName: "scripts") pod "b8be556e-5e14-4b4f-8d2f-a28f68c521fb" (UID: "b8be556e-5e14-4b4f-8d2f-a28f68c521fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.137955 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.138296 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.138310 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5lfs\" (UniqueName: \"kubernetes.io/projected/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-kube-api-access-g5lfs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.138322 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.165405 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b8be556e-5e14-4b4f-8d2f-a28f68c521fb" (UID: "b8be556e-5e14-4b4f-8d2f-a28f68c521fb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.205103 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-config-data" (OuterVolumeSpecName: "config-data") pod "b8be556e-5e14-4b4f-8d2f-a28f68c521fb" (UID: "b8be556e-5e14-4b4f-8d2f-a28f68c521fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.213106 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8be556e-5e14-4b4f-8d2f-a28f68c521fb" (UID: "b8be556e-5e14-4b4f-8d2f-a28f68c521fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.240247 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.240551 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.240643 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8be556e-5e14-4b4f-8d2f-a28f68c521fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.484504 4574 generic.go:334] "Generic (PLEG): container finished" podID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerID="873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260" exitCode=0 Oct 04 
05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.484545 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerDied","Data":"873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260"} Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.484570 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8be556e-5e14-4b4f-8d2f-a28f68c521fb","Type":"ContainerDied","Data":"1c6257107fdb1ade05bf2e6d7dc6c20d2548fff19c1401002ecfc0862a7cc982"} Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.484586 4574 scope.go:117] "RemoveContainer" containerID="5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.484731 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.530352 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.543715 4574 scope.go:117] "RemoveContainer" containerID="ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.583460 4574 scope.go:117] "RemoveContainer" containerID="873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.605713 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.611699 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:34080->10.217.0.143:8443: read: connection reset by 
peer" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.624655 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.625205 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-central-agent" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.625223 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-central-agent" Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.625256 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-notification-agent" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.625263 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-notification-agent" Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.625295 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="sg-core" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.625301 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="sg-core" Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.625328 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="proxy-httpd" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.625335 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="proxy-httpd" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.625542 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="sg-core" Oct 04 05:08:00 crc kubenswrapper[4574]: 
I1004 05:08:00.625576 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="proxy-httpd" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.625590 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-central-agent" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.625601 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" containerName="ceilometer-notification-agent" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.627551 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.632002 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.632283 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.632443 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.634285 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.652051 4574 scope.go:117] "RemoveContainer" containerID="bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.740040 4574 scope.go:117] "RemoveContainer" containerID="5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012" Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.740973 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012\": container with ID starting with 5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012 not found: ID does not exist" containerID="5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.741078 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012"} err="failed to get container status \"5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012\": rpc error: code = NotFound desc = could not find container \"5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012\": container with ID starting with 5e044d0f18281e4f84eb26ca045f74ba95a46b4cd6480145109711dfb29d2012 not found: ID does not exist" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.741161 4574 scope.go:117] "RemoveContainer" containerID="ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725" Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.743489 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725\": container with ID starting with ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725 not found: ID does not exist" containerID="ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.743528 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725"} err="failed to get container status \"ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725\": rpc error: code = NotFound desc = could not find container \"ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725\": container with ID 
starting with ddbf0d32dfaf7f7bc155713694fa9332044443918c813d66c57b32c763904725 not found: ID does not exist" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.743552 4574 scope.go:117] "RemoveContainer" containerID="873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260" Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.743773 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260\": container with ID starting with 873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260 not found: ID does not exist" containerID="873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.743790 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260"} err="failed to get container status \"873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260\": rpc error: code = NotFound desc = could not find container \"873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260\": container with ID starting with 873d6fb5d24efdbdc2bef39b04fa17f61a4e80d2971b3385d4988aa12bb49260 not found: ID does not exist" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.743802 4574 scope.go:117] "RemoveContainer" containerID="bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474" Oct 04 05:08:00 crc kubenswrapper[4574]: E1004 05:08:00.743971 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474\": container with ID starting with bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474 not found: ID does not exist" containerID="bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474" Oct 04 
05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.743988 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474"} err="failed to get container status \"bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474\": rpc error: code = NotFound desc = could not find container \"bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474\": container with ID starting with bb012ab4ec6ab1dc90421bf12d97d0009167a151cfbc2ceaa86d05d4d2fbe474 not found: ID does not exist" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.750821 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.750859 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-log-httpd\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.750930 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.750957 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-scripts\") pod \"ceilometer-0\" 
(UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.750975 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-config-data\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.750992 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-run-httpd\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.751010 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzs2m\" (UniqueName: \"kubernetes.io/projected/45499119-0d4a-4c71-9eb1-63484531f1af-kube-api-access-xzs2m\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.751069 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.754464 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8be556e-5e14-4b4f-8d2f-a28f68c521fb" path="/var/lib/kubelet/pods/b8be556e-5e14-4b4f-8d2f-a28f68c521fb/volumes" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.760579 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" 
Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.760636 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.852468 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.852527 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-scripts\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.852556 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-config-data\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.852582 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-run-httpd\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.852606 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzs2m\" (UniqueName: \"kubernetes.io/projected/45499119-0d4a-4c71-9eb1-63484531f1af-kube-api-access-xzs2m\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 
05:08:00.852692 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.852770 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.852794 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-log-httpd\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.853432 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-run-httpd\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.854078 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-log-httpd\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.861649 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.872799 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzs2m\" (UniqueName: \"kubernetes.io/projected/45499119-0d4a-4c71-9eb1-63484531f1af-kube-api-access-xzs2m\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.875244 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.876752 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-scripts\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.880884 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-config-data\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:00 crc kubenswrapper[4574]: I1004 05:08:00.881962 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " pod="openstack/ceilometer-0" Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.042947 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.520098 4574 generic.go:334] "Generic (PLEG): container finished" podID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerID="403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469" exitCode=0 Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.520174 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerDied","Data":"403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469"} Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.520211 4574 scope.go:117] "RemoveContainer" containerID="6d70675baea48ecf302727fcdbee1acb4f0596c28e0e43a0d20ca60fc171a6ba" Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.579608 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.770339 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.807025 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.843585 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 04 05:08:01 crc kubenswrapper[4574]: I1004 05:08:01.843668 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Oct 04 05:08:02 crc kubenswrapper[4574]: I1004 05:08:02.529434 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerStarted","Data":"7eada33b790be6bbfeee91063e1a6a4d29e7475c181ff0e05ec8ad9d93a1a8f3"} Oct 04 05:08:02 crc kubenswrapper[4574]: I1004 05:08:02.567677 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 04 05:08:03 crc kubenswrapper[4574]: I1004 05:08:03.548114 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerStarted","Data":"d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf"} Oct 04 05:08:03 crc kubenswrapper[4574]: I1004 05:08:03.549512 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerStarted","Data":"b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d"} Oct 04 05:08:05 crc kubenswrapper[4574]: I1004 05:08:05.567328 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerStarted","Data":"a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb"} Oct 04 05:08:05 crc kubenswrapper[4574]: I1004 05:08:05.839635 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 04 05:08:06 crc kubenswrapper[4574]: I1004 05:08:06.578901 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerStarted","Data":"36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9"} Oct 04 05:08:06 crc kubenswrapper[4574]: I1004 05:08:06.579834 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.470534 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.484661 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-combined-ca-bundle\") pod \"ae90061f-8906-44b4-8195-286492c8d770\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.484918 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-config-data\") pod \"ae90061f-8906-44b4-8195-286492c8d770\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.484980 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hxjd\" (UniqueName: \"kubernetes.io/projected/ae90061f-8906-44b4-8195-286492c8d770-kube-api-access-6hxjd\") pod \"ae90061f-8906-44b4-8195-286492c8d770\" (UID: \"ae90061f-8906-44b4-8195-286492c8d770\") " Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.501637 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.268003588 podStartE2EDuration="7.501602176s" podCreationTimestamp="2025-10-04 05:08:00 +0000 UTC" firstStartedPulling="2025-10-04 05:08:01.754908079 +0000 UTC m=+1307.609051121" lastFinishedPulling="2025-10-04 05:08:05.988506667 +0000 UTC m=+1311.842649709" observedRunningTime="2025-10-04 05:08:06.606728102 +0000 UTC m=+1312.460871144" watchObservedRunningTime="2025-10-04 05:08:07.501602176 +0000 UTC m=+1313.355745218" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.508514 4574 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae90061f-8906-44b4-8195-286492c8d770-kube-api-access-6hxjd" (OuterVolumeSpecName: "kube-api-access-6hxjd") pod "ae90061f-8906-44b4-8195-286492c8d770" (UID: "ae90061f-8906-44b4-8195-286492c8d770"). InnerVolumeSpecName "kube-api-access-6hxjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.553444 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae90061f-8906-44b4-8195-286492c8d770" (UID: "ae90061f-8906-44b4-8195-286492c8d770"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.555844 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-config-data" (OuterVolumeSpecName: "config-data") pod "ae90061f-8906-44b4-8195-286492c8d770" (UID: "ae90061f-8906-44b4-8195-286492c8d770"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.589479 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.589518 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae90061f-8906-44b4-8195-286492c8d770-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.589542 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hxjd\" (UniqueName: \"kubernetes.io/projected/ae90061f-8906-44b4-8195-286492c8d770-kube-api-access-6hxjd\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.602676 4574 generic.go:334] "Generic (PLEG): container finished" podID="ae90061f-8906-44b4-8195-286492c8d770" containerID="6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d" exitCode=137 Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.603129 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.604015 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ae90061f-8906-44b4-8195-286492c8d770","Type":"ContainerDied","Data":"6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d"} Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.604046 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ae90061f-8906-44b4-8195-286492c8d770","Type":"ContainerDied","Data":"bbddb2fafce520744812b62521a1515d4b5418309521217dc8284bdc346f9d16"} Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.604062 4574 scope.go:117] "RemoveContainer" containerID="6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.646500 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.649583 4574 scope.go:117] "RemoveContainer" containerID="6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d" Oct 04 05:08:07 crc kubenswrapper[4574]: E1004 05:08:07.650651 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d\": container with ID starting with 6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d not found: ID does not exist" containerID="6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.650711 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d"} err="failed to get container status \"6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d\": 
rpc error: code = NotFound desc = could not find container \"6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d\": container with ID starting with 6ebeea41ad053fe53d800411c3340a0df7ce6950750124067e4694ae86422b5d not found: ID does not exist" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.661457 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.670907 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:08:07 crc kubenswrapper[4574]: E1004 05:08:07.671495 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae90061f-8906-44b4-8195-286492c8d770" containerName="nova-cell1-novncproxy-novncproxy" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.671520 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae90061f-8906-44b4-8195-286492c8d770" containerName="nova-cell1-novncproxy-novncproxy" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.671800 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae90061f-8906-44b4-8195-286492c8d770" containerName="nova-cell1-novncproxy-novncproxy" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.672691 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.675993 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.676021 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.677489 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.682529 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.691192 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkxnd\" (UniqueName: \"kubernetes.io/projected/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-kube-api-access-tkxnd\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.691259 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.691344 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc 
kubenswrapper[4574]: I1004 05:08:07.691393 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.691725 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.715740 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.715806 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.731160 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.733545 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.801166 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkxnd\" (UniqueName: \"kubernetes.io/projected/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-kube-api-access-tkxnd\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.801223 4574 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.801262 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.801285 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.801760 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.812276 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.813302 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.819885 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.831015 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkxnd\" (UniqueName: \"kubernetes.io/projected/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-kube-api-access-tkxnd\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:07 crc kubenswrapper[4574]: I1004 05:08:07.835172 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dfa220-f267-43c2-9b28-4dc23a4a3eeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"43dfa220-f267-43c2-9b28-4dc23a4a3eeb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:08 crc kubenswrapper[4574]: I1004 05:08:08.005112 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:08 crc kubenswrapper[4574]: I1004 05:08:08.562616 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 04 05:08:08 crc kubenswrapper[4574]: I1004 05:08:08.625107 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"43dfa220-f267-43c2-9b28-4dc23a4a3eeb","Type":"ContainerStarted","Data":"8e07885a57dd5ecc209c8839d18fd48e8ad0922efb7056695af316a27b848d70"} Oct 04 05:08:08 crc kubenswrapper[4574]: I1004 05:08:08.749153 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae90061f-8906-44b4-8195-286492c8d770" path="/var/lib/kubelet/pods/ae90061f-8906-44b4-8195-286492c8d770/volumes" Oct 04 05:08:09 crc kubenswrapper[4574]: I1004 05:08:09.648628 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"43dfa220-f267-43c2-9b28-4dc23a4a3eeb","Type":"ContainerStarted","Data":"55f174849c01e1ad032674615880bac0b0f3614fcde5b0e91de794259bc6e42f"} Oct 04 05:08:09 crc kubenswrapper[4574]: I1004 05:08:09.668940 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.668917441 podStartE2EDuration="2.668917441s" podCreationTimestamp="2025-10-04 05:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:08:09.664811322 +0000 UTC m=+1315.518954384" watchObservedRunningTime="2025-10-04 05:08:09.668917441 +0000 UTC m=+1315.523060483" Oct 04 05:08:09 crc kubenswrapper[4574]: I1004 05:08:09.711169 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial 
tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:08:10 crc kubenswrapper[4574]: I1004 05:08:10.764351 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 05:08:10 crc kubenswrapper[4574]: I1004 05:08:10.764933 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 05:08:10 crc kubenswrapper[4574]: I1004 05:08:10.765585 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 05:08:10 crc kubenswrapper[4574]: I1004 05:08:10.766094 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 05:08:10 crc kubenswrapper[4574]: I1004 05:08:10.773478 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 05:08:10 crc kubenswrapper[4574]: I1004 05:08:10.781590 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.001363 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fbzln"] Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.015949 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.087377 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.087630 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.087827 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.087948 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-config\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.088019 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" 
(UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.088142 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn4nw\" (UniqueName: \"kubernetes.io/projected/d04c0b72-4e8a-40e4-a654-fed2b063729d-kube-api-access-dn4nw\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.110903 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fbzln"] Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.191669 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-config\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.191848 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.192743 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.192769 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-config\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.192785 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn4nw\" (UniqueName: \"kubernetes.io/projected/d04c0b72-4e8a-40e4-a654-fed2b063729d-kube-api-access-dn4nw\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.192920 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.193092 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.193419 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.194142 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.194679 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.194708 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.225180 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn4nw\" (UniqueName: \"kubernetes.io/projected/d04c0b72-4e8a-40e4-a654-fed2b063729d-kube-api-access-dn4nw\") pod \"dnsmasq-dns-59cf4bdb65-fbzln\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:11 crc kubenswrapper[4574]: I1004 05:08:11.399401 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:12 crc kubenswrapper[4574]: I1004 05:08:12.123454 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fbzln"] Oct 04 05:08:12 crc kubenswrapper[4574]: I1004 05:08:12.674815 4574 generic.go:334] "Generic (PLEG): container finished" podID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerID="ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e" exitCode=0 Oct 04 05:08:12 crc kubenswrapper[4574]: I1004 05:08:12.676827 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" event={"ID":"d04c0b72-4e8a-40e4-a654-fed2b063729d","Type":"ContainerDied","Data":"ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e"} Oct 04 05:08:12 crc kubenswrapper[4574]: I1004 05:08:12.676865 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" event={"ID":"d04c0b72-4e8a-40e4-a654-fed2b063729d","Type":"ContainerStarted","Data":"75e9e7a7450259c1a669d8224b88dbf4fe629f5375a3919f4f4d0c151ec38987"} Oct 04 05:08:13 crc kubenswrapper[4574]: I1004 05:08:13.006484 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:13 crc kubenswrapper[4574]: I1004 05:08:13.659290 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:13 crc kubenswrapper[4574]: I1004 05:08:13.703260 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" event={"ID":"d04c0b72-4e8a-40e4-a654-fed2b063729d","Type":"ContainerStarted","Data":"4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae"} Oct 04 05:08:13 crc kubenswrapper[4574]: I1004 05:08:13.703509 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-log" 
containerID="cri-o://0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd" gracePeriod=30 Oct 04 05:08:13 crc kubenswrapper[4574]: I1004 05:08:13.703564 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-api" containerID="cri-o://9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a" gracePeriod=30 Oct 04 05:08:13 crc kubenswrapper[4574]: I1004 05:08:13.742587 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" podStartSLOduration=3.742563949 podStartE2EDuration="3.742563949s" podCreationTimestamp="2025-10-04 05:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:08:13.733913389 +0000 UTC m=+1319.588056431" watchObservedRunningTime="2025-10-04 05:08:13.742563949 +0000 UTC m=+1319.596706991" Oct 04 05:08:14 crc kubenswrapper[4574]: I1004 05:08:14.721950 4574 generic.go:334] "Generic (PLEG): container finished" podID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerID="0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd" exitCode=143 Oct 04 05:08:14 crc kubenswrapper[4574]: I1004 05:08:14.722025 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7","Type":"ContainerDied","Data":"0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd"} Oct 04 05:08:14 crc kubenswrapper[4574]: I1004 05:08:14.722585 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.170366 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.170877 4574 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-central-agent" containerID="cri-o://b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d" gracePeriod=30 Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.171083 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-notification-agent" containerID="cri-o://d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf" gracePeriod=30 Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.171112 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="sg-core" containerID="cri-o://a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb" gracePeriod=30 Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.171301 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="proxy-httpd" containerID="cri-o://36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9" gracePeriod=30 Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.739901 4574 generic.go:334] "Generic (PLEG): container finished" podID="45499119-0d4a-4c71-9eb1-63484531f1af" containerID="36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9" exitCode=0 Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.740182 4574 generic.go:334] "Generic (PLEG): container finished" podID="45499119-0d4a-4c71-9eb1-63484531f1af" containerID="a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb" exitCode=2 Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.740193 4574 generic.go:334] "Generic (PLEG): container finished" podID="45499119-0d4a-4c71-9eb1-63484531f1af" 
containerID="b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d" exitCode=0 Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.739983 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerDied","Data":"36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9"} Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.740546 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerDied","Data":"a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb"} Oct 04 05:08:15 crc kubenswrapper[4574]: I1004 05:08:15.740564 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerDied","Data":"b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d"} Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.373665 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.438826 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-logs" (OuterVolumeSpecName: "logs") pod "c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" (UID: "c0aa95d2-5725-4365-bb4e-ab540b0c2eb7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.438889 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-logs\") pod \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.439060 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-combined-ca-bundle\") pod \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.439130 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-config-data\") pod \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.439222 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78tws\" (UniqueName: \"kubernetes.io/projected/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-kube-api-access-78tws\") pod \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\" (UID: \"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7\") " Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.439661 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.452473 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-kube-api-access-78tws" (OuterVolumeSpecName: "kube-api-access-78tws") pod 
"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" (UID: "c0aa95d2-5725-4365-bb4e-ab540b0c2eb7"). InnerVolumeSpecName "kube-api-access-78tws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.476619 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-config-data" (OuterVolumeSpecName: "config-data") pod "c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" (UID: "c0aa95d2-5725-4365-bb4e-ab540b0c2eb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.494586 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" (UID: "c0aa95d2-5725-4365-bb4e-ab540b0c2eb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.540967 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.541002 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.541013 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78tws\" (UniqueName: \"kubernetes.io/projected/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7-kube-api-access-78tws\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.761672 4574 generic.go:334] "Generic (PLEG): container finished" podID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerID="9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a" exitCode=0 Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.761730 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7","Type":"ContainerDied","Data":"9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a"} Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.761757 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0aa95d2-5725-4365-bb4e-ab540b0c2eb7","Type":"ContainerDied","Data":"e26111a9d3e5fbfc967bde7e5819b91db61f52122db0299655121e87fe59c32a"} Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.761791 4574 scope.go:117] "RemoveContainer" containerID="9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.761978 4574 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.799857 4574 scope.go:117] "RemoveContainer" containerID="0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.831591 4574 scope.go:117] "RemoveContainer" containerID="9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a" Oct 04 05:08:17 crc kubenswrapper[4574]: E1004 05:08:17.832046 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a\": container with ID starting with 9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a not found: ID does not exist" containerID="9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.832092 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a"} err="failed to get container status \"9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a\": rpc error: code = NotFound desc = could not find container \"9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a\": container with ID starting with 9a353485078fe5e94d83d7b50a05a1b8d4300a13d057c5da7bb3ee1aea753f8a not found: ID does not exist" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.832119 4574 scope.go:117] "RemoveContainer" containerID="0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd" Oct 04 05:08:17 crc kubenswrapper[4574]: E1004 05:08:17.834623 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd\": container with ID starting with 
0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd not found: ID does not exist" containerID="0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.834656 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd"} err="failed to get container status \"0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd\": rpc error: code = NotFound desc = could not find container \"0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd\": container with ID starting with 0086a7a4696c12525b1989c3231bd938e0896ddc87fba498786081fa9b9980cd not found: ID does not exist" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.847951 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.859532 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.879411 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:17 crc kubenswrapper[4574]: E1004 05:08:17.883392 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-api" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.883414 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-api" Oct 04 05:08:17 crc kubenswrapper[4574]: E1004 05:08:17.883435 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-log" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.883442 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-log" Oct 04 05:08:17 
crc kubenswrapper[4574]: I1004 05:08:17.883662 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-api" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.883680 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" containerName="nova-api-log" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.884730 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.890733 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.890744 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.890811 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.893730 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.979858 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-logs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.980188 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-config-data\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.980434 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-public-tls-certs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.981517 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bp66\" (UniqueName: \"kubernetes.io/projected/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-kube-api-access-8bp66\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.981657 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:17 crc kubenswrapper[4574]: I1004 05:08:17.981821 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.007396 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.034109 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.083692 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-config-data\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.084637 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-public-tls-certs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.085529 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bp66\" (UniqueName: \"kubernetes.io/projected/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-kube-api-access-8bp66\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.085673 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.085819 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.085918 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-logs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 
05:08:18.086352 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-logs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.089729 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.089981 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.090809 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-config-data\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.092981 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-public-tls-certs\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.113962 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bp66\" (UniqueName: \"kubernetes.io/projected/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-kube-api-access-8bp66\") pod \"nova-api-0\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " 
pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.214525 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.648691 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.696813 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzs2m\" (UniqueName: \"kubernetes.io/projected/45499119-0d4a-4c71-9eb1-63484531f1af-kube-api-access-xzs2m\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.696944 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-combined-ca-bundle\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.697037 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-config-data\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.697113 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-run-httpd\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.697560 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.697188 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-ceilometer-tls-certs\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.697674 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-log-httpd\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.697964 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.697991 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-sg-core-conf-yaml\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.698022 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-scripts\") pod \"45499119-0d4a-4c71-9eb1-63484531f1af\" (UID: \"45499119-0d4a-4c71-9eb1-63484531f1af\") " Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.699375 4574 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.699479 4574 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45499119-0d4a-4c71-9eb1-63484531f1af-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.702894 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-scripts" (OuterVolumeSpecName: "scripts") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.706471 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45499119-0d4a-4c71-9eb1-63484531f1af-kube-api-access-xzs2m" (OuterVolumeSpecName: "kube-api-access-xzs2m") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "kube-api-access-xzs2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.745629 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.748908 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0aa95d2-5725-4365-bb4e-ab540b0c2eb7" path="/var/lib/kubelet/pods/c0aa95d2-5725-4365-bb4e-ab540b0c2eb7/volumes" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.774820 4574 generic.go:334] "Generic (PLEG): container finished" podID="45499119-0d4a-4c71-9eb1-63484531f1af" containerID="d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf" exitCode=0 Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.774854 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.774898 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerDied","Data":"d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf"} Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.774964 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45499119-0d4a-4c71-9eb1-63484531f1af","Type":"ContainerDied","Data":"7eada33b790be6bbfeee91063e1a6a4d29e7475c181ff0e05ec8ad9d93a1a8f3"} Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.774913 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.774985 4574 scope.go:117] "RemoveContainer" containerID="36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.800729 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.801643 4574 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.801677 4574 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.801752 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.802323 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzs2m\" (UniqueName: \"kubernetes.io/projected/45499119-0d4a-4c71-9eb1-63484531f1af-kube-api-access-xzs2m\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.802356 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.806163 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.806665 4574 scope.go:117] "RemoveContainer" containerID="a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.868539 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.884558 4574 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-config-data" (OuterVolumeSpecName: "config-data") pod "45499119-0d4a-4c71-9eb1-63484531f1af" (UID: "45499119-0d4a-4c71-9eb1-63484531f1af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.905094 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45499119-0d4a-4c71-9eb1-63484531f1af-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:18 crc kubenswrapper[4574]: W1004 05:08:18.914596 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod728a2024_0cfc_4da0_876b_dc67fc3ce9aa.slice/crio-3b55790fa5764c038d62f1e83cafdcb45d2d264bcb05dc2a5c265bebc39348bf WatchSource:0}: Error finding container 3b55790fa5764c038d62f1e83cafdcb45d2d264bcb05dc2a5c265bebc39348bf: Status 404 returned error can't find the container with id 3b55790fa5764c038d62f1e83cafdcb45d2d264bcb05dc2a5c265bebc39348bf Oct 04 05:08:18 crc kubenswrapper[4574]: I1004 05:08:18.915493 4574 scope.go:117] "RemoveContainer" containerID="d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.084684 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ksgc5"] Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.085186 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-notification-agent" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.085208 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-notification-agent" Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.085295 4574 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-central-agent" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.085310 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-central-agent" Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.085328 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="sg-core" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.085335 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="sg-core" Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.085351 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="proxy-httpd" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.085358 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="proxy-httpd" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.087579 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-notification-agent" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.087622 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="ceilometer-central-agent" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.087639 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="sg-core" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.087651 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" containerName="proxy-httpd" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 
05:08:19.088688 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.097705 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.098188 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.109358 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.109533 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-config-data\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.110063 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbhz\" (UniqueName: \"kubernetes.io/projected/1be85814-868f-4899-b745-70da2af4c50a-kube-api-access-4rbhz\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.110147 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-scripts\") pod 
\"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.124083 4574 scope.go:117] "RemoveContainer" containerID="b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.129398 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ksgc5"] Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.158618 4574 scope.go:117] "RemoveContainer" containerID="36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9" Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.159108 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9\": container with ID starting with 36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9 not found: ID does not exist" containerID="36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.159350 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9"} err="failed to get container status \"36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9\": rpc error: code = NotFound desc = could not find container \"36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9\": container with ID starting with 36050863e71ae208e043868b1321d0e318eb08802259c6a5c5781ce04cafb1a9 not found: ID does not exist" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.159479 4574 scope.go:117] "RemoveContainer" containerID="a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb" Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.160129 4574 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb\": container with ID starting with a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb not found: ID does not exist" containerID="a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.160175 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb"} err="failed to get container status \"a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb\": rpc error: code = NotFound desc = could not find container \"a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb\": container with ID starting with a36ac2107502a485f1307016a7c38f051212321821283bef1662d29d69eb2cdb not found: ID does not exist" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.160203 4574 scope.go:117] "RemoveContainer" containerID="d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf" Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.160492 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf\": container with ID starting with d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf not found: ID does not exist" containerID="d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.160529 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf"} err="failed to get container status \"d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf\": rpc error: code = NotFound desc = could not find container 
\"d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf\": container with ID starting with d3dc3c79cda30b0d6f4a477c63215fc376ea50358035b13a726acf84b742f3cf not found: ID does not exist" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.160548 4574 scope.go:117] "RemoveContainer" containerID="b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d" Oct 04 05:08:19 crc kubenswrapper[4574]: E1004 05:08:19.164584 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d\": container with ID starting with b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d not found: ID does not exist" containerID="b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.164637 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d"} err="failed to get container status \"b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d\": rpc error: code = NotFound desc = could not find container \"b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d\": container with ID starting with b04f5e9574bb041978ee724c400eb281bd7ae23ab264c4412c0bc1baf029de1d not found: ID does not exist" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.164690 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.177509 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.187489 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.190680 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.193839 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.195062 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.198467 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.209318 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211549 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-scripts\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211608 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbhz\" (UniqueName: \"kubernetes.io/projected/1be85814-868f-4899-b745-70da2af4c50a-kube-api-access-4rbhz\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211637 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-log-httpd\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211676 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211711 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-config-data\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211733 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-scripts\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211753 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-run-httpd\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211774 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzjj\" (UniqueName: \"kubernetes.io/projected/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-kube-api-access-dpzjj\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211797 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-ceilometer-tls-certs\") 
pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211848 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211891 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.211936 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-config-data\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.223255 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-scripts\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.223943 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-config-data\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" 
Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.238593 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.242033 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbhz\" (UniqueName: \"kubernetes.io/projected/1be85814-868f-4899-b745-70da2af4c50a-kube-api-access-4rbhz\") pod \"nova-cell1-cell-mapping-ksgc5\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.313662 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-scripts\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.313733 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-log-httpd\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.313769 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.313838 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-config-data\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.313881 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-run-httpd\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.313914 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzjj\" (UniqueName: \"kubernetes.io/projected/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-kube-api-access-dpzjj\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.313946 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.314019 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.319108 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-run-httpd\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc 
kubenswrapper[4574]: I1004 05:08:19.319380 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-log-httpd\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.327638 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.327813 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-config-data\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.328076 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-scripts\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.328521 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.328934 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.338347 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzjj\" (UniqueName: \"kubernetes.io/projected/1abcd2f9-3753-4b7e-a5a3-0784ec9518f1-kube-api-access-dpzjj\") pod \"ceilometer-0\" (UID: \"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1\") " pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.404767 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.404827 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.427811 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.443525 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.711268 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57c7ff446b-7tmwn" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.711996 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.795143 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"728a2024-0cfc-4da0-876b-dc67fc3ce9aa","Type":"ContainerStarted","Data":"66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c"} Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.795196 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"728a2024-0cfc-4da0-876b-dc67fc3ce9aa","Type":"ContainerStarted","Data":"2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365"} Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.795207 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"728a2024-0cfc-4da0-876b-dc67fc3ce9aa","Type":"ContainerStarted","Data":"3b55790fa5764c038d62f1e83cafdcb45d2d264bcb05dc2a5c265bebc39348bf"} Oct 04 05:08:19 crc kubenswrapper[4574]: I1004 05:08:19.859131 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.859107511 podStartE2EDuration="2.859107511s" podCreationTimestamp="2025-10-04 05:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:08:19.833745387 +0000 UTC m=+1325.687888449" 
watchObservedRunningTime="2025-10-04 05:08:19.859107511 +0000 UTC m=+1325.713250553" Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.015783 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.054153 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ksgc5"] Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.744443 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45499119-0d4a-4c71-9eb1-63484531f1af" path="/var/lib/kubelet/pods/45499119-0d4a-4c71-9eb1-63484531f1af/volumes" Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.804370 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ksgc5" event={"ID":"1be85814-868f-4899-b745-70da2af4c50a","Type":"ContainerStarted","Data":"43f3dd5edcae0cf9cc8c87da04851f932be3b730fa7963bb0317412267235176"} Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.804414 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ksgc5" event={"ID":"1be85814-868f-4899-b745-70da2af4c50a","Type":"ContainerStarted","Data":"a14862c3c3e5acb086cec233a9489f827cfc5a28b901b0ddfc92369f69b0b613"} Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.809322 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1","Type":"ContainerStarted","Data":"1425ba4a37075bc24b5fd166c0bed1f6a5fc28bd6fb15612bd4294a1e72cc71e"} Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.809578 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1","Type":"ContainerStarted","Data":"2395fcbafc1e34a1ca5dd1e891bdaf9067b4707e0e78f1d3ceb1e778fc6b2a41"} Oct 04 05:08:20 crc kubenswrapper[4574]: I1004 05:08:20.823383 4574 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-cell1-cell-mapping-ksgc5" podStartSLOduration=1.8233628830000002 podStartE2EDuration="1.823362883s" podCreationTimestamp="2025-10-04 05:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:08:20.82048005 +0000 UTC m=+1326.674623092" watchObservedRunningTime="2025-10-04 05:08:20.823362883 +0000 UTC m=+1326.677505925" Oct 04 05:08:21 crc kubenswrapper[4574]: I1004 05:08:21.403476 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:08:21 crc kubenswrapper[4574]: I1004 05:08:21.476456 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6ptbq"] Oct 04 05:08:21 crc kubenswrapper[4574]: I1004 05:08:21.476881 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" podUID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerName="dnsmasq-dns" containerID="cri-o://8725e41c53a57855ab39facb6b009c34399e339c812891ca1ab473621e3e958c" gracePeriod=10 Oct 04 05:08:21 crc kubenswrapper[4574]: I1004 05:08:21.916534 4574 generic.go:334] "Generic (PLEG): container finished" podID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerID="8725e41c53a57855ab39facb6b009c34399e339c812891ca1ab473621e3e958c" exitCode=0 Oct 04 05:08:21 crc kubenswrapper[4574]: I1004 05:08:21.917053 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" event={"ID":"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567","Type":"ContainerDied","Data":"8725e41c53a57855ab39facb6b009c34399e339c812891ca1ab473621e3e958c"} Oct 04 05:08:21 crc kubenswrapper[4574]: I1004 05:08:21.932591 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1","Type":"ContainerStarted","Data":"d791c3a7e1362d8e7f4506857a0c55d9d67318b7e15446e09b2e88208d17c382"} Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.201124 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.315202 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-nb\") pod \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.315287 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgb74\" (UniqueName: \"kubernetes.io/projected/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-kube-api-access-kgb74\") pod \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.315502 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-config\") pod \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.315567 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-swift-storage-0\") pod \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.315695 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-svc\") pod \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.315725 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-sb\") pod \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\" (UID: \"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567\") " Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.351150 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-kube-api-access-kgb74" (OuterVolumeSpecName: "kube-api-access-kgb74") pod "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" (UID: "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567"). InnerVolumeSpecName "kube-api-access-kgb74". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.402467 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" (UID: "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.411371 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" (UID: "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.422130 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.422161 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.422171 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgb74\" (UniqueName: \"kubernetes.io/projected/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-kube-api-access-kgb74\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.422956 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" (UID: "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.423349 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-config" (OuterVolumeSpecName: "config") pod "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" (UID: "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.436571 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" (UID: "49ed2bc3-4bbe-4b52-a91d-e8de8d8da567"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.523958 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.524001 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.524014 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.947172 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" event={"ID":"49ed2bc3-4bbe-4b52-a91d-e8de8d8da567","Type":"ContainerDied","Data":"e021d66526ada5f7c37282d90bbc5d6047f6b2b1f9d65fd1c921a3ac36869071"} Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.947458 4574 scope.go:117] "RemoveContainer" containerID="8725e41c53a57855ab39facb6b009c34399e339c812891ca1ab473621e3e958c" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.947584 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-6ptbq" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.956249 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1","Type":"ContainerStarted","Data":"06e1312b50107f1953464ccdcc5aeb0ec26a04f5890cbe75a57c10e27568cdce"} Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.993998 4574 scope.go:117] "RemoveContainer" containerID="6eecca9648b3983501de382fce85b978a00b2d2501345cea2d00a2dfc63d1ccb" Oct 04 05:08:22 crc kubenswrapper[4574]: I1004 05:08:22.999393 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6ptbq"] Oct 04 05:08:23 crc kubenswrapper[4574]: I1004 05:08:23.012683 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-6ptbq"] Oct 04 05:08:23 crc kubenswrapper[4574]: I1004 05:08:23.969781 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abcd2f9-3753-4b7e-a5a3-0784ec9518f1","Type":"ContainerStarted","Data":"1e8143db8b3c0a40652f3515481745ce214027e7d786280f1300b427e0368f0a"} Oct 04 05:08:23 crc kubenswrapper[4574]: I1004 05:08:23.970141 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 04 05:08:24 crc kubenswrapper[4574]: I1004 05:08:24.005465 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.865928305 podStartE2EDuration="5.005441803s" podCreationTimestamp="2025-10-04 05:08:19 +0000 UTC" firstStartedPulling="2025-10-04 05:08:20.046856076 +0000 UTC m=+1325.900999118" lastFinishedPulling="2025-10-04 05:08:23.186369574 +0000 UTC m=+1329.040512616" observedRunningTime="2025-10-04 05:08:23.993615351 +0000 UTC m=+1329.847758393" watchObservedRunningTime="2025-10-04 05:08:24.005441803 +0000 UTC m=+1329.859584845" Oct 04 05:08:24 crc kubenswrapper[4574]: I1004 
05:08:24.745070 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" path="/var/lib/kubelet/pods/49ed2bc3-4bbe-4b52-a91d-e8de8d8da567/volumes" Oct 04 05:08:27 crc kubenswrapper[4574]: I1004 05:08:27.014971 4574 generic.go:334] "Generic (PLEG): container finished" podID="1be85814-868f-4899-b745-70da2af4c50a" containerID="43f3dd5edcae0cf9cc8c87da04851f932be3b730fa7963bb0317412267235176" exitCode=0 Oct 04 05:08:27 crc kubenswrapper[4574]: I1004 05:08:27.015299 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ksgc5" event={"ID":"1be85814-868f-4899-b745-70da2af4c50a","Type":"ContainerDied","Data":"43f3dd5edcae0cf9cc8c87da04851f932be3b730fa7963bb0317412267235176"} Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:27.864519 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.051615 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-secret-key\") pod \"56eac9c0-22fc-4c42-93ab-0734f058a121\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.052083 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eac9c0-22fc-4c42-93ab-0734f058a121-logs\") pod \"56eac9c0-22fc-4c42-93ab-0734f058a121\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.052175 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-combined-ca-bundle\") pod \"56eac9c0-22fc-4c42-93ab-0734f058a121\" (UID: 
\"56eac9c0-22fc-4c42-93ab-0734f058a121\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.052218 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-config-data\") pod \"56eac9c0-22fc-4c42-93ab-0734f058a121\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.052267 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-tls-certs\") pod \"56eac9c0-22fc-4c42-93ab-0734f058a121\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.052293 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6mbx\" (UniqueName: \"kubernetes.io/projected/56eac9c0-22fc-4c42-93ab-0734f058a121-kube-api-access-r6mbx\") pod \"56eac9c0-22fc-4c42-93ab-0734f058a121\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.052357 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-scripts\") pod \"56eac9c0-22fc-4c42-93ab-0734f058a121\" (UID: \"56eac9c0-22fc-4c42-93ab-0734f058a121\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.056393 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56eac9c0-22fc-4c42-93ab-0734f058a121-logs" (OuterVolumeSpecName: "logs") pod "56eac9c0-22fc-4c42-93ab-0734f058a121" (UID: "56eac9c0-22fc-4c42-93ab-0734f058a121"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.059345 4574 generic.go:334] "Generic (PLEG): container finished" podID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerID="98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa" exitCode=137 Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.059446 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57c7ff446b-7tmwn" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.059567 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerDied","Data":"98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa"} Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.059618 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57c7ff446b-7tmwn" event={"ID":"56eac9c0-22fc-4c42-93ab-0734f058a121","Type":"ContainerDied","Data":"ded04dd5820ce830aca3e3f2c4b13ba72639bd915330f9ef579a98c2f911b1ef"} Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.059674 4574 scope.go:117] "RemoveContainer" containerID="403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.071780 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "56eac9c0-22fc-4c42-93ab-0734f058a121" (UID: "56eac9c0-22fc-4c42-93ab-0734f058a121"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.078686 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56eac9c0-22fc-4c42-93ab-0734f058a121-kube-api-access-r6mbx" (OuterVolumeSpecName: "kube-api-access-r6mbx") pod "56eac9c0-22fc-4c42-93ab-0734f058a121" (UID: "56eac9c0-22fc-4c42-93ab-0734f058a121"). InnerVolumeSpecName "kube-api-access-r6mbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.129330 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-config-data" (OuterVolumeSpecName: "config-data") pod "56eac9c0-22fc-4c42-93ab-0734f058a121" (UID: "56eac9c0-22fc-4c42-93ab-0734f058a121"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.131907 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56eac9c0-22fc-4c42-93ab-0734f058a121" (UID: "56eac9c0-22fc-4c42-93ab-0734f058a121"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.152877 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-scripts" (OuterVolumeSpecName: "scripts") pod "56eac9c0-22fc-4c42-93ab-0734f058a121" (UID: "56eac9c0-22fc-4c42-93ab-0734f058a121"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.154470 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.154492 4574 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.154506 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56eac9c0-22fc-4c42-93ab-0734f058a121-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.154518 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.154529 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56eac9c0-22fc-4c42-93ab-0734f058a121-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.154539 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6mbx\" (UniqueName: \"kubernetes.io/projected/56eac9c0-22fc-4c42-93ab-0734f058a121-kube-api-access-r6mbx\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.166222 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "56eac9c0-22fc-4c42-93ab-0734f058a121" (UID: "56eac9c0-22fc-4c42-93ab-0734f058a121"). 
InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.215831 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.216291 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.258265 4574 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/56eac9c0-22fc-4c42-93ab-0734f058a121-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.435167 4574 scope.go:117] "RemoveContainer" containerID="98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.462927 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57c7ff446b-7tmwn"] Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.465565 4574 scope.go:117] "RemoveContainer" containerID="403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469" Oct 04 05:08:28 crc kubenswrapper[4574]: E1004 05:08:28.467529 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469\": container with ID starting with 403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469 not found: ID does not exist" containerID="403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.467582 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469"} err="failed to get container status 
\"403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469\": rpc error: code = NotFound desc = could not find container \"403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469\": container with ID starting with 403567605bb093302cae9568fd8f1668a8074cbc89b169e57f187610edf70469 not found: ID does not exist" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.467615 4574 scope.go:117] "RemoveContainer" containerID="98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa" Oct 04 05:08:28 crc kubenswrapper[4574]: E1004 05:08:28.467883 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa\": container with ID starting with 98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa not found: ID does not exist" containerID="98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.467906 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa"} err="failed to get container status \"98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa\": rpc error: code = NotFound desc = could not find container \"98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa\": container with ID starting with 98d9849b8eeb0c129aaad0ba5dea4d3dd934e853e64e003c87feeb224f0deaaa not found: ID does not exist" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.470961 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57c7ff446b-7tmwn"] Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.747474 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" path="/var/lib/kubelet/pods/56eac9c0-22fc-4c42-93ab-0734f058a121/volumes" Oct 04 05:08:28 crc 
kubenswrapper[4574]: I1004 05:08:28.803560 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.975079 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-config-data\") pod \"1be85814-868f-4899-b745-70da2af4c50a\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.975259 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-combined-ca-bundle\") pod \"1be85814-868f-4899-b745-70da2af4c50a\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.975362 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-scripts\") pod \"1be85814-868f-4899-b745-70da2af4c50a\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.975518 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rbhz\" (UniqueName: \"kubernetes.io/projected/1be85814-868f-4899-b745-70da2af4c50a-kube-api-access-4rbhz\") pod \"1be85814-868f-4899-b745-70da2af4c50a\" (UID: \"1be85814-868f-4899-b745-70da2af4c50a\") " Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.980740 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be85814-868f-4899-b745-70da2af4c50a-kube-api-access-4rbhz" (OuterVolumeSpecName: "kube-api-access-4rbhz") pod "1be85814-868f-4899-b745-70da2af4c50a" (UID: "1be85814-868f-4899-b745-70da2af4c50a"). InnerVolumeSpecName "kube-api-access-4rbhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:28 crc kubenswrapper[4574]: I1004 05:08:28.990395 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-scripts" (OuterVolumeSpecName: "scripts") pod "1be85814-868f-4899-b745-70da2af4c50a" (UID: "1be85814-868f-4899-b745-70da2af4c50a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.014849 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-config-data" (OuterVolumeSpecName: "config-data") pod "1be85814-868f-4899-b745-70da2af4c50a" (UID: "1be85814-868f-4899-b745-70da2af4c50a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.024483 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1be85814-868f-4899-b745-70da2af4c50a" (UID: "1be85814-868f-4899-b745-70da2af4c50a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.078051 4574 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-scripts\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.078094 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rbhz\" (UniqueName: \"kubernetes.io/projected/1be85814-868f-4899-b745-70da2af4c50a-kube-api-access-4rbhz\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.078109 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.078122 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be85814-868f-4899-b745-70da2af4c50a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.080799 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ksgc5" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.080826 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ksgc5" event={"ID":"1be85814-868f-4899-b745-70da2af4c50a","Type":"ContainerDied","Data":"a14862c3c3e5acb086cec233a9489f827cfc5a28b901b0ddfc92369f69b0b613"} Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.080905 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a14862c3c3e5acb086cec233a9489f827cfc5a28b901b0ddfc92369f69b0b613" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.221573 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.238886 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.239281 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.248204 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.248451 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a0e56256-b1dd-46b7-a662-80a85f177980" containerName="nova-scheduler-scheduler" containerID="cri-o://37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" gracePeriod=30 Oct 04 
05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.284430 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.284634 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-log" containerID="cri-o://2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91" gracePeriod=30 Oct 04 05:08:29 crc kubenswrapper[4574]: I1004 05:08:29.284892 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-metadata" containerID="cri-o://9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b" gracePeriod=30 Oct 04 05:08:30 crc kubenswrapper[4574]: I1004 05:08:30.093824 4574 generic.go:334] "Generic (PLEG): container finished" podID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerID="2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91" exitCode=143 Oct 04 05:08:30 crc kubenswrapper[4574]: I1004 05:08:30.093905 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc","Type":"ContainerDied","Data":"2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91"} Oct 04 05:08:30 crc kubenswrapper[4574]: I1004 05:08:30.094047 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-log" containerID="cri-o://2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365" gracePeriod=30 Oct 04 05:08:30 crc kubenswrapper[4574]: I1004 05:08:30.094156 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-api" 
containerID="cri-o://66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c" gracePeriod=30 Oct 04 05:08:31 crc kubenswrapper[4574]: I1004 05:08:31.104915 4574 generic.go:334] "Generic (PLEG): container finished" podID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerID="2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365" exitCode=143 Oct 04 05:08:31 crc kubenswrapper[4574]: I1004 05:08:31.104974 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"728a2024-0cfc-4da0-876b-dc67fc3ce9aa","Type":"ContainerDied","Data":"2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365"} Oct 04 05:08:31 crc kubenswrapper[4574]: E1004 05:08:31.772106 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1 is running failed: container process not found" containerID="37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 04 05:08:31 crc kubenswrapper[4574]: E1004 05:08:31.775030 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1 is running failed: container process not found" containerID="37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 04 05:08:31 crc kubenswrapper[4574]: E1004 05:08:31.775350 4574 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1 is running failed: container process not found" containerID="37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 04 05:08:31 crc kubenswrapper[4574]: E1004 05:08:31.775437 4574 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a0e56256-b1dd-46b7-a662-80a85f177980" containerName="nova-scheduler-scheduler" Oct 04 05:08:31 crc kubenswrapper[4574]: I1004 05:08:31.867680 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:08:31 crc kubenswrapper[4574]: I1004 05:08:31.959449 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-combined-ca-bundle\") pod \"a0e56256-b1dd-46b7-a662-80a85f177980\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " Oct 04 05:08:31 crc kubenswrapper[4574]: I1004 05:08:31.959498 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-config-data\") pod \"a0e56256-b1dd-46b7-a662-80a85f177980\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " Oct 04 05:08:31 crc kubenswrapper[4574]: I1004 05:08:31.959703 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xznxr\" (UniqueName: \"kubernetes.io/projected/a0e56256-b1dd-46b7-a662-80a85f177980-kube-api-access-xznxr\") pod \"a0e56256-b1dd-46b7-a662-80a85f177980\" (UID: \"a0e56256-b1dd-46b7-a662-80a85f177980\") " Oct 04 05:08:31 crc kubenswrapper[4574]: I1004 05:08:31.965009 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e56256-b1dd-46b7-a662-80a85f177980-kube-api-access-xznxr" (OuterVolumeSpecName: 
"kube-api-access-xznxr") pod "a0e56256-b1dd-46b7-a662-80a85f177980" (UID: "a0e56256-b1dd-46b7-a662-80a85f177980"). InnerVolumeSpecName "kube-api-access-xznxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.003689 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-config-data" (OuterVolumeSpecName: "config-data") pod "a0e56256-b1dd-46b7-a662-80a85f177980" (UID: "a0e56256-b1dd-46b7-a662-80a85f177980"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.011149 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0e56256-b1dd-46b7-a662-80a85f177980" (UID: "a0e56256-b1dd-46b7-a662-80a85f177980"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.063505 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.063536 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e56256-b1dd-46b7-a662-80a85f177980-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.063556 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xznxr\" (UniqueName: \"kubernetes.io/projected/a0e56256-b1dd-46b7-a662-80a85f177980-kube-api-access-xznxr\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.114362 4574 generic.go:334] "Generic (PLEG): container finished" podID="a0e56256-b1dd-46b7-a662-80a85f177980" containerID="37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" exitCode=0 Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.114406 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a0e56256-b1dd-46b7-a662-80a85f177980","Type":"ContainerDied","Data":"37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1"} Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.114423 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.114434 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a0e56256-b1dd-46b7-a662-80a85f177980","Type":"ContainerDied","Data":"51a8812350313affd6635bae1a0938d3825682b4fb0d0bba81635a19ce43ea62"} Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.114449 4574 scope.go:117] "RemoveContainer" containerID="37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.138848 4574 scope.go:117] "RemoveContainer" containerID="37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.139494 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1\": container with ID starting with 37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1 not found: ID does not exist" containerID="37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.139550 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1"} err="failed to get container status \"37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1\": rpc error: code = NotFound desc = could not find container \"37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1\": container with ID starting with 37e6614ec2a8954edb97378a19cb51be7f542ecbc425bf7142086b91f3ca83a1 not found: ID does not exist" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.154521 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.162066 4574 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.175389 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.175756 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.175773 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.175786 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerName="dnsmasq-dns" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.175792 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerName="dnsmasq-dns" Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.175819 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon-log" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.175824 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon-log" Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.175833 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.175839 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.175849 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerName="init" Oct 04 05:08:32 crc 
kubenswrapper[4574]: I1004 05:08:32.175857 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerName="init" Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.175880 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e56256-b1dd-46b7-a662-80a85f177980" containerName="nova-scheduler-scheduler" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.175887 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e56256-b1dd-46b7-a662-80a85f177980" containerName="nova-scheduler-scheduler" Oct 04 05:08:32 crc kubenswrapper[4574]: E1004 05:08:32.175901 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be85814-868f-4899-b745-70da2af4c50a" containerName="nova-manage" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.175907 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be85814-868f-4899-b745-70da2af4c50a" containerName="nova-manage" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.176067 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ed2bc3-4bbe-4b52-a91d-e8de8d8da567" containerName="dnsmasq-dns" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.176085 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.176095 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be85814-868f-4899-b745-70da2af4c50a" containerName="nova-manage" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.176109 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.176115 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e56256-b1dd-46b7-a662-80a85f177980" containerName="nova-scheduler-scheduler" Oct 04 05:08:32 crc 
kubenswrapper[4574]: I1004 05:08:32.176128 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.176139 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon-log" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.176838 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.185443 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.230771 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.266769 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-config-data\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.266877 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.266915 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llc7j\" (UniqueName: \"kubernetes.io/projected/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-kube-api-access-llc7j\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " 
pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.368035 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.368097 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llc7j\" (UniqueName: \"kubernetes.io/projected/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-kube-api-access-llc7j\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.368182 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-config-data\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.372820 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.372828 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-config-data\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.385850 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-llc7j\" (UniqueName: \"kubernetes.io/projected/48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c-kube-api-access-llc7j\") pod \"nova-scheduler-0\" (UID: \"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c\") " pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.495093 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.680486 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": dial tcp 10.217.0.195:8775: connect: connection refused" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.681503 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": dial tcp 10.217.0.195:8775: connect: connection refused" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.756060 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e56256-b1dd-46b7-a662-80a85f177980" path="/var/lib/kubelet/pods/a0e56256-b1dd-46b7-a662-80a85f177980/volumes" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.946604 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.978593 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-logs\") pod \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.978914 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-config-data\") pod \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.979188 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cdl4\" (UniqueName: \"kubernetes.io/projected/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-kube-api-access-4cdl4\") pod \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.979344 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-nova-metadata-tls-certs\") pod \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.979732 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-combined-ca-bundle\") pod \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\" (UID: \"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc\") " Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.979495 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-logs" (OuterVolumeSpecName: "logs") pod "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" (UID: "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.980602 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:32 crc kubenswrapper[4574]: I1004 05:08:32.998404 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-kube-api-access-4cdl4" (OuterVolumeSpecName: "kube-api-access-4cdl4") pod "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" (UID: "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc"). InnerVolumeSpecName "kube-api-access-4cdl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.021747 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" (UID: "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.038103 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-config-data" (OuterVolumeSpecName: "config-data") pod "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" (UID: "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.053612 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" (UID: "0c0b789f-0313-4df9-8a95-cfd4ec60f6dc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.082274 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.082315 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.082326 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cdl4\" (UniqueName: \"kubernetes.io/projected/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-kube-api-access-4cdl4\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.082338 4574 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:33 crc kubenswrapper[4574]: W1004 05:08:33.126000 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48cf5b3d_b1a3_4c9c_b2bb_82e54ca8519c.slice/crio-976be99a1c0abced253b584ae1b7bace477f4cd2ab6cb2e737df79183b486d43 WatchSource:0}: Error finding container 
976be99a1c0abced253b584ae1b7bace477f4cd2ab6cb2e737df79183b486d43: Status 404 returned error can't find the container with id 976be99a1c0abced253b584ae1b7bace477f4cd2ab6cb2e737df79183b486d43 Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.126312 4574 generic.go:334] "Generic (PLEG): container finished" podID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerID="9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b" exitCode=0 Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.126399 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.126398 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc","Type":"ContainerDied","Data":"9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b"} Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.126507 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c0b789f-0313-4df9-8a95-cfd4ec60f6dc","Type":"ContainerDied","Data":"03d84ea732ce47e319d6658dc24a8c7531359676f9ccf76b35db0917d5a256d0"} Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.126550 4574 scope.go:117] "RemoveContainer" containerID="9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.127187 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.159312 4574 scope.go:117] "RemoveContainer" containerID="2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.175696 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.210733 4574 scope.go:117] "RemoveContainer" 
containerID="9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.214090 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:08:33 crc kubenswrapper[4574]: E1004 05:08:33.214330 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b\": container with ID starting with 9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b not found: ID does not exist" containerID="9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.214368 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b"} err="failed to get container status \"9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b\": rpc error: code = NotFound desc = could not find container \"9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b\": container with ID starting with 9b15923f6b91626c77927f14e5b223f818504631b509401b628fa2cad37fc01b not found: ID does not exist" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.214394 4574 scope.go:117] "RemoveContainer" containerID="2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91" Oct 04 05:08:33 crc kubenswrapper[4574]: E1004 05:08:33.214888 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91\": container with ID starting with 2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91 not found: ID does not exist" containerID="2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.214922 4574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91"} err="failed to get container status \"2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91\": rpc error: code = NotFound desc = could not find container \"2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91\": container with ID starting with 2e729a3e5d7b094328af7a32134e5968d4e39fc045c71a6adfe99da5d14a1f91 not found: ID does not exist" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.230189 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:08:33 crc kubenswrapper[4574]: E1004 05:08:33.230724 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-metadata" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.230748 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-metadata" Oct 04 05:08:33 crc kubenswrapper[4574]: E1004 05:08:33.230791 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.230802 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eac9c0-22fc-4c42-93ab-0734f058a121" containerName="horizon" Oct 04 05:08:33 crc kubenswrapper[4574]: E1004 05:08:33.230823 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-log" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.230833 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-log" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.231096 4574 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-log" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.231119 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" containerName="nova-metadata-metadata" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.232452 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.234780 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.235049 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.240993 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.287834 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncn68\" (UniqueName: \"kubernetes.io/projected/7e8c70bd-bcf3-4379-a026-5a52411a56ab-kube-api-access-ncn68\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.288024 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-config-data\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.288531 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.288780 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.288892 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e8c70bd-bcf3-4379-a026-5a52411a56ab-logs\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.391078 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e8c70bd-bcf3-4379-a026-5a52411a56ab-logs\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.391446 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e8c70bd-bcf3-4379-a026-5a52411a56ab-logs\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.391596 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncn68\" (UniqueName: \"kubernetes.io/projected/7e8c70bd-bcf3-4379-a026-5a52411a56ab-kube-api-access-ncn68\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" 
Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.391947 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-config-data\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.392495 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.392608 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.396173 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-config-data\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.396997 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.397342 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e8c70bd-bcf3-4379-a026-5a52411a56ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.408598 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncn68\" (UniqueName: \"kubernetes.io/projected/7e8c70bd-bcf3-4379-a026-5a52411a56ab-kube-api-access-ncn68\") pod \"nova-metadata-0\" (UID: \"7e8c70bd-bcf3-4379-a026-5a52411a56ab\") " pod="openstack/nova-metadata-0" Oct 04 05:08:33 crc kubenswrapper[4574]: I1004 05:08:33.570147 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 04 05:08:34 crc kubenswrapper[4574]: I1004 05:08:34.028925 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 04 05:08:34 crc kubenswrapper[4574]: I1004 05:08:34.141698 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e8c70bd-bcf3-4379-a026-5a52411a56ab","Type":"ContainerStarted","Data":"492c2576555863ff46b56c9b7460f4906a3096b076f0ea391195a63425d9a470"} Oct 04 05:08:34 crc kubenswrapper[4574]: I1004 05:08:34.142921 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c","Type":"ContainerStarted","Data":"cc873be85a032b214178fbdbf3e82881b4bd4057363057314a8bb06910c1f846"} Oct 04 05:08:34 crc kubenswrapper[4574]: I1004 05:08:34.142947 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c","Type":"ContainerStarted","Data":"976be99a1c0abced253b584ae1b7bace477f4cd2ab6cb2e737df79183b486d43"} Oct 04 05:08:34 crc kubenswrapper[4574]: I1004 05:08:34.162425 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.16238684 
podStartE2EDuration="2.16238684s" podCreationTimestamp="2025-10-04 05:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:08:34.162247426 +0000 UTC m=+1340.016390468" watchObservedRunningTime="2025-10-04 05:08:34.16238684 +0000 UTC m=+1340.016529882" Oct 04 05:08:34 crc kubenswrapper[4574]: I1004 05:08:34.764607 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0b789f-0313-4df9-8a95-cfd4ec60f6dc" path="/var/lib/kubelet/pods/0c0b789f-0313-4df9-8a95-cfd4ec60f6dc/volumes" Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.153443 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e8c70bd-bcf3-4379-a026-5a52411a56ab","Type":"ContainerStarted","Data":"d2a4e02559b5eb0159d1cc7b629054fb874502b3eb269ee63d3c88132319f90e"} Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.153498 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e8c70bd-bcf3-4379-a026-5a52411a56ab","Type":"ContainerStarted","Data":"95179f4f7c752322565a5f12b9cce0d23f1065bd7cbea696bf50e9f0f73817cb"} Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.180847 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.180826891 podStartE2EDuration="2.180826891s" podCreationTimestamp="2025-10-04 05:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:08:35.17839187 +0000 UTC m=+1341.032534932" watchObservedRunningTime="2025-10-04 05:08:35.180826891 +0000 UTC m=+1341.034969933" Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.895976 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.948564 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-internal-tls-certs\") pod \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.948653 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-logs\") pod \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.948724 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bp66\" (UniqueName: \"kubernetes.io/projected/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-kube-api-access-8bp66\") pod \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.948878 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-combined-ca-bundle\") pod \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.948934 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-public-tls-certs\") pod \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.948978 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-config-data\") pod \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\" (UID: \"728a2024-0cfc-4da0-876b-dc67fc3ce9aa\") " Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.949730 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-logs" (OuterVolumeSpecName: "logs") pod "728a2024-0cfc-4da0-876b-dc67fc3ce9aa" (UID: "728a2024-0cfc-4da0-876b-dc67fc3ce9aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.954384 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-kube-api-access-8bp66" (OuterVolumeSpecName: "kube-api-access-8bp66") pod "728a2024-0cfc-4da0-876b-dc67fc3ce9aa" (UID: "728a2024-0cfc-4da0-876b-dc67fc3ce9aa"). InnerVolumeSpecName "kube-api-access-8bp66". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:08:35 crc kubenswrapper[4574]: I1004 05:08:35.981777 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-config-data" (OuterVolumeSpecName: "config-data") pod "728a2024-0cfc-4da0-876b-dc67fc3ce9aa" (UID: "728a2024-0cfc-4da0-876b-dc67fc3ce9aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.002465 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728a2024-0cfc-4da0-876b-dc67fc3ce9aa" (UID: "728a2024-0cfc-4da0-876b-dc67fc3ce9aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.007839 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "728a2024-0cfc-4da0-876b-dc67fc3ce9aa" (UID: "728a2024-0cfc-4da0-876b-dc67fc3ce9aa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.020692 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "728a2024-0cfc-4da0-876b-dc67fc3ce9aa" (UID: "728a2024-0cfc-4da0-876b-dc67fc3ce9aa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.051469 4574 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.051745 4574 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-logs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.051755 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bp66\" (UniqueName: \"kubernetes.io/projected/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-kube-api-access-8bp66\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.051765 4574 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.051772 4574 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.051780 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a2024-0cfc-4da0-876b-dc67fc3ce9aa-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.167209 4574 generic.go:334] "Generic (PLEG): container finished" podID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerID="66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c" exitCode=0 Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.167542 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.168184 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"728a2024-0cfc-4da0-876b-dc67fc3ce9aa","Type":"ContainerDied","Data":"66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c"} Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.168222 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"728a2024-0cfc-4da0-876b-dc67fc3ce9aa","Type":"ContainerDied","Data":"3b55790fa5764c038d62f1e83cafdcb45d2d264bcb05dc2a5c265bebc39348bf"} Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.168265 4574 scope.go:117] "RemoveContainer" containerID="66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.202397 4574 scope.go:117] "RemoveContainer" containerID="2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.205954 4574 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.229384 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.239908 4574 scope.go:117] "RemoveContainer" containerID="66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c" Oct 04 05:08:36 crc kubenswrapper[4574]: E1004 05:08:36.240351 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c\": container with ID starting with 66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c not found: ID does not exist" containerID="66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.240595 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c"} err="failed to get container status \"66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c\": rpc error: code = NotFound desc = could not find container \"66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c\": container with ID starting with 66ee6f506fc32f1c9c6e68867264b23a25d59c2b6d350cfe3861bb4918614a1c not found: ID does not exist" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.240624 4574 scope.go:117] "RemoveContainer" containerID="2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365" Oct 04 05:08:36 crc kubenswrapper[4574]: E1004 05:08:36.240926 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365\": container with ID starting with 2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365 not 
found: ID does not exist" containerID="2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.240956 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365"} err="failed to get container status \"2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365\": rpc error: code = NotFound desc = could not find container \"2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365\": container with ID starting with 2c3ec0fe074b0e8882c705342abc70d6fdce7256fd03e373a73e6f39af9c7365 not found: ID does not exist" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.244792 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:36 crc kubenswrapper[4574]: E1004 05:08:36.245354 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-log" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.245378 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-log" Oct 04 05:08:36 crc kubenswrapper[4574]: E1004 05:08:36.245644 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-api" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.245655 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-api" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.245924 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" containerName="nova-api-log" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.245964 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" 
containerName="nova-api-api" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.248565 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.253759 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.254167 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.254381 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.257843 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.357810 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97365d9d-d7a3-42b9-8131-54dea698f6f8-logs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.357874 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.357974 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-config-data\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.358022 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.358072 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59ns5\" (UniqueName: \"kubernetes.io/projected/97365d9d-d7a3-42b9-8131-54dea698f6f8-kube-api-access-59ns5\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.358122 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.461161 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-config-data\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.461308 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.461354 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59ns5\" (UniqueName: 
\"kubernetes.io/projected/97365d9d-d7a3-42b9-8131-54dea698f6f8-kube-api-access-59ns5\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.461435 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.461475 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97365d9d-d7a3-42b9-8131-54dea698f6f8-logs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.461523 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.463668 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97365d9d-d7a3-42b9-8131-54dea698f6f8-logs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.466621 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.467896 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-config-data\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.467934 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.468346 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97365d9d-d7a3-42b9-8131-54dea698f6f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.480822 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59ns5\" (UniqueName: \"kubernetes.io/projected/97365d9d-d7a3-42b9-8131-54dea698f6f8-kube-api-access-59ns5\") pod \"nova-api-0\" (UID: \"97365d9d-d7a3-42b9-8131-54dea698f6f8\") " pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.579636 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 04 05:08:36 crc kubenswrapper[4574]: I1004 05:08:36.750155 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728a2024-0cfc-4da0-876b-dc67fc3ce9aa" path="/var/lib/kubelet/pods/728a2024-0cfc-4da0-876b-dc67fc3ce9aa/volumes" Oct 04 05:08:37 crc kubenswrapper[4574]: I1004 05:08:37.018986 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 04 05:08:37 crc kubenswrapper[4574]: W1004 05:08:37.037828 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97365d9d_d7a3_42b9_8131_54dea698f6f8.slice/crio-cc3fc61dc11d47b96dcbaf962adcc8eb815ac2ada15a72da75127b86a088ba05 WatchSource:0}: Error finding container cc3fc61dc11d47b96dcbaf962adcc8eb815ac2ada15a72da75127b86a088ba05: Status 404 returned error can't find the container with id cc3fc61dc11d47b96dcbaf962adcc8eb815ac2ada15a72da75127b86a088ba05 Oct 04 05:08:37 crc kubenswrapper[4574]: I1004 05:08:37.177897 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97365d9d-d7a3-42b9-8131-54dea698f6f8","Type":"ContainerStarted","Data":"cc3fc61dc11d47b96dcbaf962adcc8eb815ac2ada15a72da75127b86a088ba05"} Oct 04 05:08:37 crc kubenswrapper[4574]: I1004 05:08:37.495428 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 04 05:08:38 crc kubenswrapper[4574]: I1004 05:08:38.188333 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97365d9d-d7a3-42b9-8131-54dea698f6f8","Type":"ContainerStarted","Data":"5c738bd128f3e6e518f7a070005ab46aded6507d699e94d784476a66c2e1a585"} Oct 04 05:08:38 crc kubenswrapper[4574]: I1004 05:08:38.188666 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"97365d9d-d7a3-42b9-8131-54dea698f6f8","Type":"ContainerStarted","Data":"c3723cb17236c53ad732ec1284e8bb3aa4fc059674556f08742e8460e4c976bd"} Oct 04 05:08:38 crc kubenswrapper[4574]: I1004 05:08:38.206483 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.206463731 podStartE2EDuration="2.206463731s" podCreationTimestamp="2025-10-04 05:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:08:38.20366427 +0000 UTC m=+1344.057807312" watchObservedRunningTime="2025-10-04 05:08:38.206463731 +0000 UTC m=+1344.060606773" Oct 04 05:08:38 crc kubenswrapper[4574]: I1004 05:08:38.570367 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 05:08:38 crc kubenswrapper[4574]: I1004 05:08:38.571338 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 04 05:08:42 crc kubenswrapper[4574]: I1004 05:08:42.495770 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 04 05:08:42 crc kubenswrapper[4574]: I1004 05:08:42.522952 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 04 05:08:43 crc kubenswrapper[4574]: I1004 05:08:43.263241 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 04 05:08:43 crc kubenswrapper[4574]: I1004 05:08:43.570421 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 05:08:43 crc kubenswrapper[4574]: I1004 05:08:43.570470 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 04 05:08:44 crc kubenswrapper[4574]: I1004 05:08:44.583467 4574 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="7e8c70bd-bcf3-4379-a026-5a52411a56ab" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:08:44 crc kubenswrapper[4574]: I1004 05:08:44.584080 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7e8c70bd-bcf3-4379-a026-5a52411a56ab" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:08:46 crc kubenswrapper[4574]: I1004 05:08:46.580797 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 05:08:46 crc kubenswrapper[4574]: I1004 05:08:46.581097 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 04 05:08:47 crc kubenswrapper[4574]: I1004 05:08:47.592420 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97365d9d-d7a3-42b9-8131-54dea698f6f8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:08:47 crc kubenswrapper[4574]: I1004 05:08:47.592461 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97365d9d-d7a3-42b9-8131-54dea698f6f8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 04 05:08:49 crc kubenswrapper[4574]: I1004 05:08:49.405162 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:08:49 crc kubenswrapper[4574]: I1004 05:08:49.405490 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:08:49 crc kubenswrapper[4574]: I1004 05:08:49.405537 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:08:49 crc kubenswrapper[4574]: I1004 05:08:49.406264 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"422e81dba527fdce5fb46863c657f7a61bb4d0e601b192c209383ffeaf65198f"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:08:49 crc kubenswrapper[4574]: I1004 05:08:49.406325 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://422e81dba527fdce5fb46863c657f7a61bb4d0e601b192c209383ffeaf65198f" gracePeriod=600 Oct 04 05:08:49 crc kubenswrapper[4574]: I1004 05:08:49.461254 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 04 05:08:50 crc kubenswrapper[4574]: I1004 05:08:50.293102 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="422e81dba527fdce5fb46863c657f7a61bb4d0e601b192c209383ffeaf65198f" exitCode=0 Oct 04 05:08:50 crc kubenswrapper[4574]: I1004 05:08:50.293651 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"422e81dba527fdce5fb46863c657f7a61bb4d0e601b192c209383ffeaf65198f"} Oct 04 05:08:50 crc kubenswrapper[4574]: I1004 05:08:50.293679 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc"} Oct 04 05:08:50 crc kubenswrapper[4574]: I1004 05:08:50.293695 4574 scope.go:117] "RemoveContainer" containerID="0c021ed99dab79e0bc143879c96505d7aa34ab49c6d5b17fbf9b9b39bbe04b86" Oct 04 05:08:53 crc kubenswrapper[4574]: I1004 05:08:53.576856 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 05:08:53 crc kubenswrapper[4574]: I1004 05:08:53.583515 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 04 05:08:53 crc kubenswrapper[4574]: I1004 05:08:53.584764 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 05:08:54 crc kubenswrapper[4574]: I1004 05:08:54.333701 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 04 05:08:56 crc kubenswrapper[4574]: I1004 05:08:56.586960 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 05:08:56 crc kubenswrapper[4574]: I1004 05:08:56.588342 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 04 05:08:56 crc kubenswrapper[4574]: I1004 05:08:56.588595 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 05:08:56 crc kubenswrapper[4574]: I1004 
05:08:56.588646 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 04 05:08:56 crc kubenswrapper[4574]: I1004 05:08:56.595073 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 05:08:56 crc kubenswrapper[4574]: I1004 05:08:56.595201 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 04 05:09:04 crc kubenswrapper[4574]: I1004 05:09:04.687813 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:09:05 crc kubenswrapper[4574]: I1004 05:09:05.789053 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 05:09:09 crc kubenswrapper[4574]: I1004 05:09:09.956837 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="16df8292-9780-4212-a920-bf0eed95da87" containerName="rabbitmq" containerID="cri-o://d5655f18f8668b0b0b32b184f1ce68bc9a08312686a470d75b2f8870edb99e71" gracePeriod=604795 Oct 04 05:09:10 crc kubenswrapper[4574]: I1004 05:09:10.309063 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerName="rabbitmq" containerID="cri-o://1278d15b8a81afdd76322b79acff8b815745598554a13cb237126afb3b6e9dd6" gracePeriod=604796 Oct 04 05:09:13 crc kubenswrapper[4574]: I1004 05:09:13.699124 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 04 05:09:14 crc kubenswrapper[4574]: I1004 05:09:14.254401 4574 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="16df8292-9780-4212-a920-bf0eed95da87" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.530310 4574 generic.go:334] "Generic (PLEG): container finished" podID="16df8292-9780-4212-a920-bf0eed95da87" containerID="d5655f18f8668b0b0b32b184f1ce68bc9a08312686a470d75b2f8870edb99e71" exitCode=0 Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.530933 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"16df8292-9780-4212-a920-bf0eed95da87","Type":"ContainerDied","Data":"d5655f18f8668b0b0b32b184f1ce68bc9a08312686a470d75b2f8870edb99e71"} Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.530977 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"16df8292-9780-4212-a920-bf0eed95da87","Type":"ContainerDied","Data":"0535ae0034bb52957ed5134e371173486b6506562458def59ef1a9efe987e125"} Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.530987 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0535ae0034bb52957ed5134e371173486b6506562458def59ef1a9efe987e125" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.535653 4574 generic.go:334] "Generic (PLEG): container finished" podID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerID="1278d15b8a81afdd76322b79acff8b815745598554a13cb237126afb3b6e9dd6" exitCode=0 Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.535692 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e3699c-e19d-4c38-b763-32af874a1a90","Type":"ContainerDied","Data":"1278d15b8a81afdd76322b79acff8b815745598554a13cb237126afb3b6e9dd6"} Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.595714 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.663768 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-server-conf\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.663813 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.663849 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-erlang-cookie\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.663869 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-config-data\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.663888 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-plugins\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.663938 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs9x9\" (UniqueName: 
\"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-kube-api-access-xs9x9\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.664150 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-confd\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.664260 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-plugins-conf\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.664284 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16df8292-9780-4212-a920-bf0eed95da87-erlang-cookie-secret\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.664312 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16df8292-9780-4212-a920-bf0eed95da87-pod-info\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.664345 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-tls\") pod \"16df8292-9780-4212-a920-bf0eed95da87\" (UID: \"16df8292-9780-4212-a920-bf0eed95da87\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 
05:09:16.668796 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.669056 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.671168 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.719205 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16df8292-9780-4212-a920-bf0eed95da87-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.719445 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.735959 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-kube-api-access-xs9x9" (OuterVolumeSpecName: "kube-api-access-xs9x9") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "kube-api-access-xs9x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.736048 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.748593 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/16df8292-9780-4212-a920-bf0eed95da87-pod-info" (OuterVolumeSpecName: "pod-info") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.773963 4574 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.774455 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.774477 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.774489 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs9x9\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-kube-api-access-xs9x9\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.774499 4574 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.774507 4574 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16df8292-9780-4212-a920-bf0eed95da87-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.774517 4574 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16df8292-9780-4212-a920-bf0eed95da87-pod-info\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 
05:09:16.774526 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.784133 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-config-data" (OuterVolumeSpecName: "config-data") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.801313 4574 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.825062 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.844811 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-server-conf" (OuterVolumeSpecName: "server-conf") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.878268 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-config-data\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.878671 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-tls\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.878811 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.878936 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e3699c-e19d-4c38-b763-32af874a1a90-erlang-cookie-secret\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.878977 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-plugins\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.878996 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-server-conf\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.879066 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e3699c-e19d-4c38-b763-32af874a1a90-pod-info\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.879090 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c2vn\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-kube-api-access-8c2vn\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.879114 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-confd\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.879145 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-erlang-cookie\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.879176 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-plugins-conf\") pod \"d3e3699c-e19d-4c38-b763-32af874a1a90\" (UID: \"d3e3699c-e19d-4c38-b763-32af874a1a90\") " Oct 04 05:09:16 crc 
kubenswrapper[4574]: I1004 05:09:16.881428 4574 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-server-conf\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.881452 4574 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.881467 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16df8292-9780-4212-a920-bf0eed95da87-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.887331 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.887729 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.900670 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.926071 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.937660 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-config-data" (OuterVolumeSpecName: "config-data") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.940714 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e3699c-e19d-4c38-b763-32af874a1a90-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.967052 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:09:16 crc kubenswrapper[4574]: I1004 05:09:16.974293 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-kube-api-access-8c2vn" (OuterVolumeSpecName: "kube-api-access-8c2vn") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "kube-api-access-8c2vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004377 4574 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004413 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004422 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004451 4574 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 
04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004462 4574 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e3699c-e19d-4c38-b763-32af874a1a90-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004471 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004482 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c2vn\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-kube-api-access-8c2vn\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.004496 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.025979 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d3e3699c-e19d-4c38-b763-32af874a1a90-pod-info" (OuterVolumeSpecName: "pod-info") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.062529 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-server-conf" (OuterVolumeSpecName: "server-conf") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.092120 4574 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.100040 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "16df8292-9780-4212-a920-bf0eed95da87" (UID: "16df8292-9780-4212-a920-bf0eed95da87"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.106944 4574 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e3699c-e19d-4c38-b763-32af874a1a90-pod-info\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.106982 4574 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.106993 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16df8292-9780-4212-a920-bf0eed95da87-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.107038 4574 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e3699c-e19d-4c38-b763-32af874a1a90-server-conf\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.155444 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-confd" 
(OuterVolumeSpecName: "rabbitmq-confd") pod "d3e3699c-e19d-4c38-b763-32af874a1a90" (UID: "d3e3699c-e19d-4c38-b763-32af874a1a90"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.208494 4574 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e3699c-e19d-4c38-b763-32af874a1a90-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.545080 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.545906 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.546025 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e3699c-e19d-4c38-b763-32af874a1a90","Type":"ContainerDied","Data":"7ae442e3be29bb3902342d6eb938cefa74befaf7e9e43c5bc607bf61cb8f2c98"} Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.546151 4574 scope.go:117] "RemoveContainer" containerID="1278d15b8a81afdd76322b79acff8b815745598554a13cb237126afb3b6e9dd6" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.572749 4574 scope.go:117] "RemoveContainer" containerID="19e755d98857189714271accecdc264d16ad48dbf72fe80113eb003f0a2478ba" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.586884 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.620304 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.637711 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 
05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.664720 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.683828 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:09:17 crc kubenswrapper[4574]: E1004 05:09:17.684544 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerName="setup-container" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.684561 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerName="setup-container" Oct 04 05:09:17 crc kubenswrapper[4574]: E1004 05:09:17.684585 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16df8292-9780-4212-a920-bf0eed95da87" containerName="rabbitmq" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.684592 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="16df8292-9780-4212-a920-bf0eed95da87" containerName="rabbitmq" Oct 04 05:09:17 crc kubenswrapper[4574]: E1004 05:09:17.684618 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16df8292-9780-4212-a920-bf0eed95da87" containerName="setup-container" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.684625 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="16df8292-9780-4212-a920-bf0eed95da87" containerName="setup-container" Oct 04 05:09:17 crc kubenswrapper[4574]: E1004 05:09:17.684644 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerName="rabbitmq" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.684650 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerName="rabbitmq" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.684870 4574 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="16df8292-9780-4212-a920-bf0eed95da87" containerName="rabbitmq" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.684886 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" containerName="rabbitmq" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.685936 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.691391 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.691640 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.691771 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.691920 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.692038 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.692167 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-s8hzq" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.692334 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.696282 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.699047 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.703642 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.703862 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.704483 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mwpqk" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.704607 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.704730 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.704734 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.704893 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.723895 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.723963 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724008 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt24m\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-kube-api-access-qt24m\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724036 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724073 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724109 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724146 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " 
pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724177 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724203 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724277 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.724310 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.731149 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.743559 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.825947 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.825999 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826041 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826177 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826225 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826293 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826328 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826439 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826503 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826560 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826588 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826633 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826694 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826740 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826784 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpfsq\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-kube-api-access-qpfsq\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826819 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-config-data\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " 
pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826844 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826899 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt24m\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-kube-api-access-qt24m\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826936 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826959 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.826990 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 
05:09:17.827024 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.827723 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.828321 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.828692 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.829413 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-config-data\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.829579 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.830251 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.831767 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.831972 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.832615 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.835311 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc 
kubenswrapper[4574]: I1004 05:09:17.853396 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt24m\" (UniqueName: \"kubernetes.io/projected/1298ffd0-9c09-4f29-b8bf-eaff9018fcb4-kube-api-access-qt24m\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.865624 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4\") " pod="openstack/rabbitmq-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929200 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929277 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpfsq\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-kube-api-access-qpfsq\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929305 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929346 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929367 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929403 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929453 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929468 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929507 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929533 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.929558 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.930028 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.930142 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.932455 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " 
pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.932875 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.933147 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.933665 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.934207 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.934321 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.939906 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.942965 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.949671 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpfsq\" (UniqueName: \"kubernetes.io/projected/bbad2653-45e8-4eb2-b7f8-60e6dcee36f2-kube-api-access-qpfsq\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:17 crc kubenswrapper[4574]: I1004 05:09:17.997576 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:18 crc kubenswrapper[4574]: I1004 05:09:18.042880 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 04 05:09:18 crc kubenswrapper[4574]: I1004 05:09:18.064321 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 04 05:09:18 crc kubenswrapper[4574]: I1004 05:09:18.603033 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 04 05:09:18 crc kubenswrapper[4574]: W1004 05:09:18.621700 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1298ffd0_9c09_4f29_b8bf_eaff9018fcb4.slice/crio-ac45bd2f1d02cceb0f55233a0a6fc1d78915e7e825f0fccbeb53ee9d3af291e8 WatchSource:0}: Error finding container ac45bd2f1d02cceb0f55233a0a6fc1d78915e7e825f0fccbeb53ee9d3af291e8: Status 404 returned error can't find the container with id ac45bd2f1d02cceb0f55233a0a6fc1d78915e7e825f0fccbeb53ee9d3af291e8
Oct 04 05:09:18 crc kubenswrapper[4574]: I1004 05:09:18.749874 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16df8292-9780-4212-a920-bf0eed95da87" path="/var/lib/kubelet/pods/16df8292-9780-4212-a920-bf0eed95da87/volumes"
Oct 04 05:09:18 crc kubenswrapper[4574]: I1004 05:09:18.751064 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e3699c-e19d-4c38-b763-32af874a1a90" path="/var/lib/kubelet/pods/d3e3699c-e19d-4c38-b763-32af874a1a90/volumes"
Oct 04 05:09:18 crc kubenswrapper[4574]: I1004 05:09:18.751809 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 04 05:09:18 crc kubenswrapper[4574]: W1004 05:09:18.753933 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbad2653_45e8_4eb2_b7f8_60e6dcee36f2.slice/crio-15c613312200dcbdaa93d9cc331abdc87af1ef561f5d651a7d8f8525a1a83c3b WatchSource:0}: Error finding container 15c613312200dcbdaa93d9cc331abdc87af1ef561f5d651a7d8f8525a1a83c3b: Status 404 returned error can't find the container with id 15c613312200dcbdaa93d9cc331abdc87af1ef561f5d651a7d8f8525a1a83c3b
Oct 04 05:09:19 crc kubenswrapper[4574]: I1004 05:09:19.565621 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4","Type":"ContainerStarted","Data":"ac45bd2f1d02cceb0f55233a0a6fc1d78915e7e825f0fccbeb53ee9d3af291e8"}
Oct 04 05:09:19 crc kubenswrapper[4574]: I1004 05:09:19.567649 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2","Type":"ContainerStarted","Data":"15c613312200dcbdaa93d9cc331abdc87af1ef561f5d651a7d8f8525a1a83c3b"}
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.367741 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jzrx2"]
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.369645 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.371585 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.380790 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.380897 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8hsp\" (UniqueName: \"kubernetes.io/projected/53ceefbb-4311-41e7-9e96-6c01e293c29f-kube-api-access-k8hsp\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.380962 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-config\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.380996 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.381047 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.381117 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.381148 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.402923 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jzrx2"]
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.481987 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8hsp\" (UniqueName: \"kubernetes.io/projected/53ceefbb-4311-41e7-9e96-6c01e293c29f-kube-api-access-k8hsp\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.482057 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-config\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.482081 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.482115 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.482170 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.482188 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.482208 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.483146 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.483292 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.483434 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.483517 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-config\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.483547 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.483745 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.509584 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8hsp\" (UniqueName: \"kubernetes.io/projected/53ceefbb-4311-41e7-9e96-6c01e293c29f-kube-api-access-k8hsp\") pod \"dnsmasq-dns-67b789f86c-jzrx2\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.577165 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4","Type":"ContainerStarted","Data":"704417d327ce8c595975804e1d855d1b3e6b2d77498a00e2da47d7ca801d5b7e"}
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.578428 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2","Type":"ContainerStarted","Data":"ec08a255f5efc39a7363c16558a36894e8b09c32db10c36f879557be5b668e4d"}
Oct 04 05:09:20 crc kubenswrapper[4574]: I1004 05:09:20.698254 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:21 crc kubenswrapper[4574]: I1004 05:09:21.032487 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jzrx2"]
Oct 04 05:09:21 crc kubenswrapper[4574]: W1004 05:09:21.035991 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53ceefbb_4311_41e7_9e96_6c01e293c29f.slice/crio-5c9a0976b3a9d84ebb22ad3dab43910546956f22bac7b9bc7dd520f9b99c1ee4 WatchSource:0}: Error finding container 5c9a0976b3a9d84ebb22ad3dab43910546956f22bac7b9bc7dd520f9b99c1ee4: Status 404 returned error can't find the container with id 5c9a0976b3a9d84ebb22ad3dab43910546956f22bac7b9bc7dd520f9b99c1ee4
Oct 04 05:09:21 crc kubenswrapper[4574]: I1004 05:09:21.413162 4574 scope.go:117] "RemoveContainer" containerID="2428074d47972d1f6fdd6c280ab98af22b8aa63b1019d3e79680f303071f5225"
Oct 04 05:09:21 crc kubenswrapper[4574]: I1004 05:09:21.587906 4574 generic.go:334] "Generic (PLEG): container finished" podID="53ceefbb-4311-41e7-9e96-6c01e293c29f" containerID="56ef2dbcc6181687c898b4065366934cdada9394b8afc945864a782aa07f82fc" exitCode=0
Oct 04 05:09:21 crc kubenswrapper[4574]: I1004 05:09:21.587995 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" event={"ID":"53ceefbb-4311-41e7-9e96-6c01e293c29f","Type":"ContainerDied","Data":"56ef2dbcc6181687c898b4065366934cdada9394b8afc945864a782aa07f82fc"}
Oct 04 05:09:21 crc kubenswrapper[4574]: I1004 05:09:21.588039 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" event={"ID":"53ceefbb-4311-41e7-9e96-6c01e293c29f","Type":"ContainerStarted","Data":"5c9a0976b3a9d84ebb22ad3dab43910546956f22bac7b9bc7dd520f9b99c1ee4"}
Oct 04 05:09:22 crc kubenswrapper[4574]: I1004 05:09:22.624286 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" event={"ID":"53ceefbb-4311-41e7-9e96-6c01e293c29f","Type":"ContainerStarted","Data":"906cbf17697529ba8979f9983afd83b0c07ac36e85d22fe483de5e33dedd18d9"}
Oct 04 05:09:22 crc kubenswrapper[4574]: I1004 05:09:22.626390 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:22 crc kubenswrapper[4574]: I1004 05:09:22.655483 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" podStartSLOduration=2.655461366 podStartE2EDuration="2.655461366s" podCreationTimestamp="2025-10-04 05:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:09:22.646701663 +0000 UTC m=+1388.500844775" watchObservedRunningTime="2025-10-04 05:09:22.655461366 +0000 UTC m=+1388.509604408"
Oct 04 05:09:30 crc kubenswrapper[4574]: I1004 05:09:30.700341 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2"
Oct 04 05:09:30 crc kubenswrapper[4574]: I1004 05:09:30.784158 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fbzln"]
Oct 04 05:09:30 crc kubenswrapper[4574]: I1004 05:09:30.989521 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-44rq7"]
Oct 04 05:09:30 crc kubenswrapper[4574]: I1004 05:09:30.991400 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.010218 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-44rq7"]
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.115031 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8dcf\" (UniqueName: \"kubernetes.io/projected/3f536491-9237-4de1-b43d-2ffefcf26eb8-kube-api-access-r8dcf\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.115332 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.115456 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.115598 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.115688 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.115724 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-config\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.115745 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.217879 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.218145 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.218342 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.218884 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.218883 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.219058 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-openstack-edpm-ipam\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.219125 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-config\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.219586 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.220465 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8dcf\" (UniqueName: \"kubernetes.io/projected/3f536491-9237-4de1-b43d-2ffefcf26eb8-kube-api-access-r8dcf\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.220849 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.219827 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-config\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.220309 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.221491 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f536491-9237-4de1-b43d-2ffefcf26eb8-dns-svc\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.243174 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8dcf\" (UniqueName: \"kubernetes.io/projected/3f536491-9237-4de1-b43d-2ffefcf26eb8-kube-api-access-r8dcf\") pod \"dnsmasq-dns-7fd9f947b7-44rq7\" (UID: \"3f536491-9237-4de1-b43d-2ffefcf26eb8\") " pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.311480 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7"
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.702942 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerName="dnsmasq-dns" containerID="cri-o://4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae" gracePeriod=10
Oct 04 05:09:31 crc kubenswrapper[4574]: I1004 05:09:31.810938 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd9f947b7-44rq7"]
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.114685 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln"
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.248004 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-svc\") pod \"d04c0b72-4e8a-40e4-a654-fed2b063729d\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") "
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.248501 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn4nw\" (UniqueName: \"kubernetes.io/projected/d04c0b72-4e8a-40e4-a654-fed2b063729d-kube-api-access-dn4nw\") pod \"d04c0b72-4e8a-40e4-a654-fed2b063729d\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") "
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.248622 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-sb\") pod \"d04c0b72-4e8a-40e4-a654-fed2b063729d\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") "
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.248647 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-nb\") pod \"d04c0b72-4e8a-40e4-a654-fed2b063729d\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") "
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.248757 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-config\") pod \"d04c0b72-4e8a-40e4-a654-fed2b063729d\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") "
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.248800 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-swift-storage-0\") pod \"d04c0b72-4e8a-40e4-a654-fed2b063729d\" (UID: \"d04c0b72-4e8a-40e4-a654-fed2b063729d\") "
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.264612 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04c0b72-4e8a-40e4-a654-fed2b063729d-kube-api-access-dn4nw" (OuterVolumeSpecName: "kube-api-access-dn4nw") pod "d04c0b72-4e8a-40e4-a654-fed2b063729d" (UID: "d04c0b72-4e8a-40e4-a654-fed2b063729d"). InnerVolumeSpecName "kube-api-access-dn4nw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.332823 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-config" (OuterVolumeSpecName: "config") pod "d04c0b72-4e8a-40e4-a654-fed2b063729d" (UID: "d04c0b72-4e8a-40e4-a654-fed2b063729d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.340445 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d04c0b72-4e8a-40e4-a654-fed2b063729d" (UID: "d04c0b72-4e8a-40e4-a654-fed2b063729d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.345136 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d04c0b72-4e8a-40e4-a654-fed2b063729d" (UID: "d04c0b72-4e8a-40e4-a654-fed2b063729d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.352312 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn4nw\" (UniqueName: \"kubernetes.io/projected/d04c0b72-4e8a-40e4-a654-fed2b063729d-kube-api-access-dn4nw\") on node \"crc\" DevicePath \"\""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.352355 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.352371 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-config\") on node \"crc\" DevicePath \"\""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.352382 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.353517 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d04c0b72-4e8a-40e4-a654-fed2b063729d" (UID: "d04c0b72-4e8a-40e4-a654-fed2b063729d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.384332 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d04c0b72-4e8a-40e4-a654-fed2b063729d" (UID: "d04c0b72-4e8a-40e4-a654-fed2b063729d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.454682 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.454726 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d04c0b72-4e8a-40e4-a654-fed2b063729d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.716871 4574 generic.go:334] "Generic (PLEG): container finished" podID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerID="4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae" exitCode=0
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.716953 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" event={"ID":"d04c0b72-4e8a-40e4-a654-fed2b063729d","Type":"ContainerDied","Data":"4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae"}
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.717485 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" event={"ID":"d04c0b72-4e8a-40e4-a654-fed2b063729d","Type":"ContainerDied","Data":"75e9e7a7450259c1a669d8224b88dbf4fe629f5375a3919f4f4d0c151ec38987"}
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.716979 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln"
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.717541 4574 scope.go:117] "RemoveContainer" containerID="4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae"
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.721645 4574 generic.go:334] "Generic (PLEG): container finished" podID="3f536491-9237-4de1-b43d-2ffefcf26eb8" containerID="bbd64f2c4a1ce69cc71d101913af80f2a1ddd97f442ebcae813200e02290d012" exitCode=0
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.721686 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7" event={"ID":"3f536491-9237-4de1-b43d-2ffefcf26eb8","Type":"ContainerDied","Data":"bbd64f2c4a1ce69cc71d101913af80f2a1ddd97f442ebcae813200e02290d012"}
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.721705 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7" event={"ID":"3f536491-9237-4de1-b43d-2ffefcf26eb8","Type":"ContainerStarted","Data":"433f6dbaf6609e7856abf63c0690ce84e498e38fe6451e3450db15ddc0ff05ce"}
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.759489 4574 scope.go:117] "RemoveContainer" containerID="ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e"
Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.807873 4574 scope.go:117] "RemoveContainer" containerID="4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae"
Oct 04 05:09:32 crc kubenswrapper[4574]: E1004 05:09:32.808206 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae\": container with ID starting with 4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae not found: ID does not exist" containerID="4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae"
Oct 04 05:09:32 crc kubenswrapper[4574]: 
I1004 05:09:32.808333 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae"} err="failed to get container status \"4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae\": rpc error: code = NotFound desc = could not find container \"4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae\": container with ID starting with 4dcd8788bb601e61defc132812fe09d7ec50a5d1757e430e3719ad81e0ea6fae not found: ID does not exist" Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.808461 4574 scope.go:117] "RemoveContainer" containerID="ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e" Oct 04 05:09:32 crc kubenswrapper[4574]: E1004 05:09:32.809060 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e\": container with ID starting with ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e not found: ID does not exist" containerID="ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e" Oct 04 05:09:32 crc kubenswrapper[4574]: I1004 05:09:32.809079 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e"} err="failed to get container status \"ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e\": rpc error: code = NotFound desc = could not find container \"ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e\": container with ID starting with ba88c1e9c5224106fcb3a26588d42d6dfe2d3b92a88987e5d2b37c83cdcc539e not found: ID does not exist" Oct 04 05:09:33 crc kubenswrapper[4574]: I1004 05:09:33.733607 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7" 
event={"ID":"3f536491-9237-4de1-b43d-2ffefcf26eb8","Type":"ContainerStarted","Data":"9462046a01bf5400f0e85af78a5990b20b8714c1927c25d97b26ad5ae6b8d7c4"} Oct 04 05:09:33 crc kubenswrapper[4574]: I1004 05:09:33.734080 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7" Oct 04 05:09:33 crc kubenswrapper[4574]: I1004 05:09:33.756813 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7" podStartSLOduration=3.756780856 podStartE2EDuration="3.756780856s" podCreationTimestamp="2025-10-04 05:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:09:33.750729701 +0000 UTC m=+1399.604872743" watchObservedRunningTime="2025-10-04 05:09:33.756780856 +0000 UTC m=+1399.610923898" Oct 04 05:09:41 crc kubenswrapper[4574]: I1004 05:09:41.312434 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd9f947b7-44rq7" Oct 04 05:09:41 crc kubenswrapper[4574]: I1004 05:09:41.375098 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jzrx2"] Oct 04 05:09:41 crc kubenswrapper[4574]: I1004 05:09:41.375723 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" podUID="53ceefbb-4311-41e7-9e96-6c01e293c29f" containerName="dnsmasq-dns" containerID="cri-o://906cbf17697529ba8979f9983afd83b0c07ac36e85d22fe483de5e33dedd18d9" gracePeriod=10 Oct 04 05:09:41 crc kubenswrapper[4574]: I1004 05:09:41.832399 4574 generic.go:334] "Generic (PLEG): container finished" podID="53ceefbb-4311-41e7-9e96-6c01e293c29f" containerID="906cbf17697529ba8979f9983afd83b0c07ac36e85d22fe483de5e33dedd18d9" exitCode=0 Oct 04 05:09:41 crc kubenswrapper[4574]: I1004 05:09:41.832745 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" event={"ID":"53ceefbb-4311-41e7-9e96-6c01e293c29f","Type":"ContainerDied","Data":"906cbf17697529ba8979f9983afd83b0c07ac36e85d22fe483de5e33dedd18d9"} Oct 04 05:09:41 crc kubenswrapper[4574]: I1004 05:09:41.926561 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.041936 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-nb\") pod \"53ceefbb-4311-41e7-9e96-6c01e293c29f\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.042018 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-svc\") pod \"53ceefbb-4311-41e7-9e96-6c01e293c29f\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.042129 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-swift-storage-0\") pod \"53ceefbb-4311-41e7-9e96-6c01e293c29f\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.042149 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-openstack-edpm-ipam\") pod \"53ceefbb-4311-41e7-9e96-6c01e293c29f\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.042201 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-sb\") pod \"53ceefbb-4311-41e7-9e96-6c01e293c29f\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.042342 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8hsp\" (UniqueName: \"kubernetes.io/projected/53ceefbb-4311-41e7-9e96-6c01e293c29f-kube-api-access-k8hsp\") pod \"53ceefbb-4311-41e7-9e96-6c01e293c29f\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.042433 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-config\") pod \"53ceefbb-4311-41e7-9e96-6c01e293c29f\" (UID: \"53ceefbb-4311-41e7-9e96-6c01e293c29f\") " Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.073777 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ceefbb-4311-41e7-9e96-6c01e293c29f-kube-api-access-k8hsp" (OuterVolumeSpecName: "kube-api-access-k8hsp") pod "53ceefbb-4311-41e7-9e96-6c01e293c29f" (UID: "53ceefbb-4311-41e7-9e96-6c01e293c29f"). InnerVolumeSpecName "kube-api-access-k8hsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.104921 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53ceefbb-4311-41e7-9e96-6c01e293c29f" (UID: "53ceefbb-4311-41e7-9e96-6c01e293c29f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.118803 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-config" (OuterVolumeSpecName: "config") pod "53ceefbb-4311-41e7-9e96-6c01e293c29f" (UID: "53ceefbb-4311-41e7-9e96-6c01e293c29f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.118826 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53ceefbb-4311-41e7-9e96-6c01e293c29f" (UID: "53ceefbb-4311-41e7-9e96-6c01e293c29f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.119796 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53ceefbb-4311-41e7-9e96-6c01e293c29f" (UID: "53ceefbb-4311-41e7-9e96-6c01e293c29f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.122098 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53ceefbb-4311-41e7-9e96-6c01e293c29f" (UID: "53ceefbb-4311-41e7-9e96-6c01e293c29f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.138569 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "53ceefbb-4311-41e7-9e96-6c01e293c29f" (UID: "53ceefbb-4311-41e7-9e96-6c01e293c29f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.145608 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8hsp\" (UniqueName: \"kubernetes.io/projected/53ceefbb-4311-41e7-9e96-6c01e293c29f-kube-api-access-k8hsp\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.145651 4574 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.145665 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.145679 4574 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.145690 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.145702 4574 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.145716 4574 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53ceefbb-4311-41e7-9e96-6c01e293c29f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.842932 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" event={"ID":"53ceefbb-4311-41e7-9e96-6c01e293c29f","Type":"ContainerDied","Data":"5c9a0976b3a9d84ebb22ad3dab43910546956f22bac7b9bc7dd520f9b99c1ee4"} Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.842977 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jzrx2" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.843189 4574 scope.go:117] "RemoveContainer" containerID="906cbf17697529ba8979f9983afd83b0c07ac36e85d22fe483de5e33dedd18d9" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.871604 4574 scope.go:117] "RemoveContainer" containerID="56ef2dbcc6181687c898b4065366934cdada9394b8afc945864a782aa07f82fc" Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.874486 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jzrx2"] Oct 04 05:09:42 crc kubenswrapper[4574]: I1004 05:09:42.883059 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jzrx2"] Oct 04 05:09:44 crc kubenswrapper[4574]: I1004 05:09:44.753915 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ceefbb-4311-41e7-9e96-6c01e293c29f" path="/var/lib/kubelet/pods/53ceefbb-4311-41e7-9e96-6c01e293c29f/volumes" Oct 04 05:09:52 crc kubenswrapper[4574]: I1004 05:09:52.942864 4574 generic.go:334] "Generic (PLEG): container finished" 
podID="bbad2653-45e8-4eb2-b7f8-60e6dcee36f2" containerID="ec08a255f5efc39a7363c16558a36894e8b09c32db10c36f879557be5b668e4d" exitCode=0 Oct 04 05:09:52 crc kubenswrapper[4574]: I1004 05:09:52.942970 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2","Type":"ContainerDied","Data":"ec08a255f5efc39a7363c16558a36894e8b09c32db10c36f879557be5b668e4d"} Oct 04 05:09:52 crc kubenswrapper[4574]: I1004 05:09:52.952985 4574 generic.go:334] "Generic (PLEG): container finished" podID="1298ffd0-9c09-4f29-b8bf-eaff9018fcb4" containerID="704417d327ce8c595975804e1d855d1b3e6b2d77498a00e2da47d7ca801d5b7e" exitCode=0 Oct 04 05:09:52 crc kubenswrapper[4574]: I1004 05:09:52.953026 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4","Type":"ContainerDied","Data":"704417d327ce8c595975804e1d855d1b3e6b2d77498a00e2da47d7ca801d5b7e"} Oct 04 05:09:53 crc kubenswrapper[4574]: I1004 05:09:53.963972 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bbad2653-45e8-4eb2-b7f8-60e6dcee36f2","Type":"ContainerStarted","Data":"d676c1ea41ed425845d454240536629c9e438b0b7a9c8fdf85ab2e8b86af7829"} Oct 04 05:09:53 crc kubenswrapper[4574]: I1004 05:09:53.965417 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:09:53 crc kubenswrapper[4574]: I1004 05:09:53.967911 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1298ffd0-9c09-4f29-b8bf-eaff9018fcb4","Type":"ContainerStarted","Data":"49e7d520b7b63b075a9a6d1028cb59c6443474c454ca804357b7a6ceacc1366f"} Oct 04 05:09:53 crc kubenswrapper[4574]: I1004 05:09:53.968099 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 04 05:09:54 crc kubenswrapper[4574]: I1004 
05:09:54.038137 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.038119103 podStartE2EDuration="37.038119103s" podCreationTimestamp="2025-10-04 05:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:09:54.032691456 +0000 UTC m=+1419.886834498" watchObservedRunningTime="2025-10-04 05:09:54.038119103 +0000 UTC m=+1419.892262145" Oct 04 05:09:54 crc kubenswrapper[4574]: I1004 05:09:54.043468 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.043451407 podStartE2EDuration="37.043451407s" podCreationTimestamp="2025-10-04 05:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:09:54.004687016 +0000 UTC m=+1419.858830078" watchObservedRunningTime="2025-10-04 05:09:54.043451407 +0000 UTC m=+1419.897594449" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.547174 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf"] Oct 04 05:09:59 crc kubenswrapper[4574]: E1004 05:09:59.548225 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerName="init" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.548266 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerName="init" Oct 04 05:09:59 crc kubenswrapper[4574]: E1004 05:09:59.548287 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ceefbb-4311-41e7-9e96-6c01e293c29f" containerName="init" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.548295 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ceefbb-4311-41e7-9e96-6c01e293c29f" 
containerName="init" Oct 04 05:09:59 crc kubenswrapper[4574]: E1004 05:09:59.548320 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerName="dnsmasq-dns" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.548328 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerName="dnsmasq-dns" Oct 04 05:09:59 crc kubenswrapper[4574]: E1004 05:09:59.548356 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ceefbb-4311-41e7-9e96-6c01e293c29f" containerName="dnsmasq-dns" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.548364 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ceefbb-4311-41e7-9e96-6c01e293c29f" containerName="dnsmasq-dns" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.548602 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ceefbb-4311-41e7-9e96-6c01e293c29f" containerName="dnsmasq-dns" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.548619 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" containerName="dnsmasq-dns" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.549304 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.552103 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.552392 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.552539 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.552641 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.565801 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf"] Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.708440 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jt4d\" (UniqueName: \"kubernetes.io/projected/f0a5e204-886d-416f-96ad-46cc7715e417-kube-api-access-8jt4d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.708511 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.708601 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.708640 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.809814 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jt4d\" (UniqueName: \"kubernetes.io/projected/f0a5e204-886d-416f-96ad-46cc7715e417-kube-api-access-8jt4d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.809874 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.809928 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.809964 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.815169 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.816144 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.816680 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc 
kubenswrapper[4574]: I1004 05:09:59.828972 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jt4d\" (UniqueName: \"kubernetes.io/projected/f0a5e204-886d-416f-96ad-46cc7715e417-kube-api-access-8jt4d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:09:59 crc kubenswrapper[4574]: I1004 05:09:59.876097 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:10:00 crc kubenswrapper[4574]: I1004 05:10:00.534064 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf"] Oct 04 05:10:01 crc kubenswrapper[4574]: I1004 05:10:01.027187 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" event={"ID":"f0a5e204-886d-416f-96ad-46cc7715e417","Type":"ContainerStarted","Data":"0045bbc97d8dfcf56098bfc2dd0e5b39aa2083442c0ae4e72fab4d71bb52502d"} Oct 04 05:10:02 crc kubenswrapper[4574]: I1004 05:10:02.786968 4574 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd04c0b72-4e8a-40e4-a654-fed2b063729d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd04c0b72-4e8a-40e4-a654-fed2b063729d] : Timed out while waiting for systemd to remove kubepods-besteffort-podd04c0b72_4e8a_40e4_a654_fed2b063729d.slice" Oct 04 05:10:02 crc kubenswrapper[4574]: E1004 05:10:02.787301 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podd04c0b72-4e8a-40e4-a654-fed2b063729d] : unable to destroy cgroup paths for cgroup [kubepods besteffort podd04c0b72-4e8a-40e4-a654-fed2b063729d] : Timed out while waiting for systemd to remove 
kubepods-besteffort-podd04c0b72_4e8a_40e4_a654_fed2b063729d.slice" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" Oct 04 05:10:03 crc kubenswrapper[4574]: I1004 05:10:03.049372 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-fbzln" Oct 04 05:10:03 crc kubenswrapper[4574]: I1004 05:10:03.084093 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fbzln"] Oct 04 05:10:03 crc kubenswrapper[4574]: I1004 05:10:03.098223 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-fbzln"] Oct 04 05:10:04 crc kubenswrapper[4574]: I1004 05:10:04.747712 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04c0b72-4e8a-40e4-a654-fed2b063729d" path="/var/lib/kubelet/pods/d04c0b72-4e8a-40e4-a654-fed2b063729d/volumes" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.551454 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k8zkp"] Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.554195 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.562894 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8zkp"] Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.651380 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-utilities\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.651726 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9628g\" (UniqueName: \"kubernetes.io/projected/813082f8-5c17-4b5f-b0f6-f1f956efd469-kube-api-access-9628g\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.651764 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-catalog-content\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.753352 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-catalog-content\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.753847 4574 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-catalog-content\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.754659 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-utilities\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.754973 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-utilities\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.755364 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9628g\" (UniqueName: \"kubernetes.io/projected/813082f8-5c17-4b5f-b0f6-f1f956efd469-kube-api-access-9628g\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.776695 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9628g\" (UniqueName: \"kubernetes.io/projected/813082f8-5c17-4b5f-b0f6-f1f956efd469-kube-api-access-9628g\") pod \"certified-operators-k8zkp\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:06 crc kubenswrapper[4574]: I1004 05:10:06.896890 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:08 crc kubenswrapper[4574]: I1004 05:10:08.047700 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 04 05:10:08 crc kubenswrapper[4574]: I1004 05:10:08.070382 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 04 05:10:10 crc kubenswrapper[4574]: I1004 05:10:10.611081 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8zkp"] Oct 04 05:10:10 crc kubenswrapper[4574]: W1004 05:10:10.613758 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod813082f8_5c17_4b5f_b0f6_f1f956efd469.slice/crio-304f7bddcb5fe41e0e4aaee75e47654d4ea51236939394fb1ee51295d69be2e2 WatchSource:0}: Error finding container 304f7bddcb5fe41e0e4aaee75e47654d4ea51236939394fb1ee51295d69be2e2: Status 404 returned error can't find the container with id 304f7bddcb5fe41e0e4aaee75e47654d4ea51236939394fb1ee51295d69be2e2 Oct 04 05:10:11 crc kubenswrapper[4574]: I1004 05:10:11.147574 4574 generic.go:334] "Generic (PLEG): container finished" podID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerID="84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0" exitCode=0 Oct 04 05:10:11 crc kubenswrapper[4574]: I1004 05:10:11.147683 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8zkp" event={"ID":"813082f8-5c17-4b5f-b0f6-f1f956efd469","Type":"ContainerDied","Data":"84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0"} Oct 04 05:10:11 crc kubenswrapper[4574]: I1004 05:10:11.148167 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8zkp" 
event={"ID":"813082f8-5c17-4b5f-b0f6-f1f956efd469","Type":"ContainerStarted","Data":"304f7bddcb5fe41e0e4aaee75e47654d4ea51236939394fb1ee51295d69be2e2"} Oct 04 05:10:11 crc kubenswrapper[4574]: I1004 05:10:11.150151 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" event={"ID":"f0a5e204-886d-416f-96ad-46cc7715e417","Type":"ContainerStarted","Data":"bd034cd54562fbb430be0acf242d27272c705fd1264b6376d0f5ab468121fef7"} Oct 04 05:10:11 crc kubenswrapper[4574]: I1004 05:10:11.191909 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" podStartSLOduration=2.4967715679999998 podStartE2EDuration="12.191889018s" podCreationTimestamp="2025-10-04 05:09:59 +0000 UTC" firstStartedPulling="2025-10-04 05:10:00.550445032 +0000 UTC m=+1426.404588074" lastFinishedPulling="2025-10-04 05:10:10.245562482 +0000 UTC m=+1436.099705524" observedRunningTime="2025-10-04 05:10:11.182323961 +0000 UTC m=+1437.036467003" watchObservedRunningTime="2025-10-04 05:10:11.191889018 +0000 UTC m=+1437.046032050" Oct 04 05:10:12 crc kubenswrapper[4574]: I1004 05:10:12.163782 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8zkp" event={"ID":"813082f8-5c17-4b5f-b0f6-f1f956efd469","Type":"ContainerStarted","Data":"c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7"} Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.310471 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l8bs4"] Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.314812 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.322172 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l8bs4"] Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.510044 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jw4w\" (UniqueName: \"kubernetes.io/projected/30b6fbc3-1578-424c-8b33-a22582f46051-kube-api-access-9jw4w\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.510329 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-utilities\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.510494 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-catalog-content\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.612533 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jw4w\" (UniqueName: \"kubernetes.io/projected/30b6fbc3-1578-424c-8b33-a22582f46051-kube-api-access-9jw4w\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.612578 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-utilities\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.612671 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-catalog-content\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.613125 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-utilities\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.613196 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-catalog-content\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.634278 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jw4w\" (UniqueName: \"kubernetes.io/projected/30b6fbc3-1578-424c-8b33-a22582f46051-kube-api-access-9jw4w\") pod \"redhat-operators-l8bs4\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:13 crc kubenswrapper[4574]: I1004 05:10:13.649147 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:14 crc kubenswrapper[4574]: I1004 05:10:14.141963 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l8bs4"] Oct 04 05:10:14 crc kubenswrapper[4574]: I1004 05:10:14.181804 4574 generic.go:334] "Generic (PLEG): container finished" podID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerID="c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7" exitCode=0 Oct 04 05:10:14 crc kubenswrapper[4574]: I1004 05:10:14.181873 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8zkp" event={"ID":"813082f8-5c17-4b5f-b0f6-f1f956efd469","Type":"ContainerDied","Data":"c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7"} Oct 04 05:10:14 crc kubenswrapper[4574]: I1004 05:10:14.185067 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8bs4" event={"ID":"30b6fbc3-1578-424c-8b33-a22582f46051","Type":"ContainerStarted","Data":"e121425457c0ab688a98d690380306fb42ffa95c6bcbdedb254a06ee1aabea03"} Oct 04 05:10:15 crc kubenswrapper[4574]: I1004 05:10:15.197143 4574 generic.go:334] "Generic (PLEG): container finished" podID="30b6fbc3-1578-424c-8b33-a22582f46051" containerID="eaf6623c5182f9f6e307515d17e104ca2bb91e0fd4a5f356632ce9128ab8ed85" exitCode=0 Oct 04 05:10:15 crc kubenswrapper[4574]: I1004 05:10:15.197497 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8bs4" event={"ID":"30b6fbc3-1578-424c-8b33-a22582f46051","Type":"ContainerDied","Data":"eaf6623c5182f9f6e307515d17e104ca2bb91e0fd4a5f356632ce9128ab8ed85"} Oct 04 05:10:15 crc kubenswrapper[4574]: I1004 05:10:15.207455 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8zkp" 
event={"ID":"813082f8-5c17-4b5f-b0f6-f1f956efd469","Type":"ContainerStarted","Data":"6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57"} Oct 04 05:10:16 crc kubenswrapper[4574]: I1004 05:10:16.897176 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:16 crc kubenswrapper[4574]: I1004 05:10:16.897763 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:17 crc kubenswrapper[4574]: I1004 05:10:17.227034 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8bs4" event={"ID":"30b6fbc3-1578-424c-8b33-a22582f46051","Type":"ContainerStarted","Data":"3a6a6a5e915da3f42e99e63455ff19aa1ede7e3e579a48c30dd3932569394983"} Oct 04 05:10:17 crc kubenswrapper[4574]: I1004 05:10:17.259251 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k8zkp" podStartSLOduration=7.796737872 podStartE2EDuration="11.259184064s" podCreationTimestamp="2025-10-04 05:10:06 +0000 UTC" firstStartedPulling="2025-10-04 05:10:11.149308686 +0000 UTC m=+1437.003451728" lastFinishedPulling="2025-10-04 05:10:14.611754878 +0000 UTC m=+1440.465897920" observedRunningTime="2025-10-04 05:10:15.237586902 +0000 UTC m=+1441.091729954" watchObservedRunningTime="2025-10-04 05:10:17.259184064 +0000 UTC m=+1443.113327106" Oct 04 05:10:17 crc kubenswrapper[4574]: I1004 05:10:17.953958 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k8zkp" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="registry-server" probeResult="failure" output=< Oct 04 05:10:17 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:10:17 crc kubenswrapper[4574]: > Oct 04 05:10:21 crc kubenswrapper[4574]: I1004 05:10:21.541214 4574 scope.go:117] 
"RemoveContainer" containerID="d5655f18f8668b0b0b32b184f1ce68bc9a08312686a470d75b2f8870edb99e71" Oct 04 05:10:21 crc kubenswrapper[4574]: I1004 05:10:21.569147 4574 scope.go:117] "RemoveContainer" containerID="0388828482fb6b864c1bef5ccd6e44ad60cb0ffe96fef79814984f65deb9ea81" Oct 04 05:10:21 crc kubenswrapper[4574]: I1004 05:10:21.593332 4574 scope.go:117] "RemoveContainer" containerID="76051f35e51e8a4820c47cfcc47ff16bc1456f3d7a405600540c6f19a3b961a8" Oct 04 05:10:21 crc kubenswrapper[4574]: I1004 05:10:21.652241 4574 scope.go:117] "RemoveContainer" containerID="c27c879b5bfc700a69f9da14a6e8697a92671c4ea9d95c59db59f6cb3b3ab5e7" Oct 04 05:10:22 crc kubenswrapper[4574]: I1004 05:10:22.278162 4574 generic.go:334] "Generic (PLEG): container finished" podID="30b6fbc3-1578-424c-8b33-a22582f46051" containerID="3a6a6a5e915da3f42e99e63455ff19aa1ede7e3e579a48c30dd3932569394983" exitCode=0 Oct 04 05:10:22 crc kubenswrapper[4574]: I1004 05:10:22.278212 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8bs4" event={"ID":"30b6fbc3-1578-424c-8b33-a22582f46051","Type":"ContainerDied","Data":"3a6a6a5e915da3f42e99e63455ff19aa1ede7e3e579a48c30dd3932569394983"} Oct 04 05:10:23 crc kubenswrapper[4574]: I1004 05:10:23.290184 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8bs4" event={"ID":"30b6fbc3-1578-424c-8b33-a22582f46051","Type":"ContainerStarted","Data":"f3e1c2d4f00cf884966f3a30e812683119ecf011c56c0ca8d46265dfed225282"} Oct 04 05:10:23 crc kubenswrapper[4574]: I1004 05:10:23.320394 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l8bs4" podStartSLOduration=2.859292568 podStartE2EDuration="10.320375431s" podCreationTimestamp="2025-10-04 05:10:13 +0000 UTC" firstStartedPulling="2025-10-04 05:10:15.20187903 +0000 UTC m=+1441.056022072" lastFinishedPulling="2025-10-04 05:10:22.662961893 +0000 UTC m=+1448.517104935" 
observedRunningTime="2025-10-04 05:10:23.309075454 +0000 UTC m=+1449.163218506" watchObservedRunningTime="2025-10-04 05:10:23.320375431 +0000 UTC m=+1449.174518473" Oct 04 05:10:23 crc kubenswrapper[4574]: I1004 05:10:23.649651 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:23 crc kubenswrapper[4574]: I1004 05:10:23.649983 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:10:24 crc kubenswrapper[4574]: I1004 05:10:24.301292 4574 generic.go:334] "Generic (PLEG): container finished" podID="f0a5e204-886d-416f-96ad-46cc7715e417" containerID="bd034cd54562fbb430be0acf242d27272c705fd1264b6376d0f5ab468121fef7" exitCode=0 Oct 04 05:10:24 crc kubenswrapper[4574]: I1004 05:10:24.301793 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" event={"ID":"f0a5e204-886d-416f-96ad-46cc7715e417","Type":"ContainerDied","Data":"bd034cd54562fbb430be0acf242d27272c705fd1264b6376d0f5ab468121fef7"} Oct 04 05:10:24 crc kubenswrapper[4574]: I1004 05:10:24.706549 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8bs4" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" probeResult="failure" output=< Oct 04 05:10:24 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:10:24 crc kubenswrapper[4574]: > Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.751351 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.890326 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jt4d\" (UniqueName: \"kubernetes.io/projected/f0a5e204-886d-416f-96ad-46cc7715e417-kube-api-access-8jt4d\") pod \"f0a5e204-886d-416f-96ad-46cc7715e417\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.890422 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-repo-setup-combined-ca-bundle\") pod \"f0a5e204-886d-416f-96ad-46cc7715e417\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.890462 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-inventory\") pod \"f0a5e204-886d-416f-96ad-46cc7715e417\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.890617 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-ssh-key\") pod \"f0a5e204-886d-416f-96ad-46cc7715e417\" (UID: \"f0a5e204-886d-416f-96ad-46cc7715e417\") " Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.897576 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f0a5e204-886d-416f-96ad-46cc7715e417" (UID: "f0a5e204-886d-416f-96ad-46cc7715e417"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.897781 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0a5e204-886d-416f-96ad-46cc7715e417-kube-api-access-8jt4d" (OuterVolumeSpecName: "kube-api-access-8jt4d") pod "f0a5e204-886d-416f-96ad-46cc7715e417" (UID: "f0a5e204-886d-416f-96ad-46cc7715e417"). InnerVolumeSpecName "kube-api-access-8jt4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.925223 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f0a5e204-886d-416f-96ad-46cc7715e417" (UID: "f0a5e204-886d-416f-96ad-46cc7715e417"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.930494 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-inventory" (OuterVolumeSpecName: "inventory") pod "f0a5e204-886d-416f-96ad-46cc7715e417" (UID: "f0a5e204-886d-416f-96ad-46cc7715e417"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.992727 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jt4d\" (UniqueName: \"kubernetes.io/projected/f0a5e204-886d-416f-96ad-46cc7715e417-kube-api-access-8jt4d\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.992769 4574 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.992782 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:25 crc kubenswrapper[4574]: I1004 05:10:25.992796 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0a5e204-886d-416f-96ad-46cc7715e417-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.320743 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" event={"ID":"f0a5e204-886d-416f-96ad-46cc7715e417","Type":"ContainerDied","Data":"0045bbc97d8dfcf56098bfc2dd0e5b39aa2083442c0ae4e72fab4d71bb52502d"} Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.320784 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0045bbc97d8dfcf56098bfc2dd0e5b39aa2083442c0ae4e72fab4d71bb52502d" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.320788 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.384939 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r"] Oct 04 05:10:26 crc kubenswrapper[4574]: E1004 05:10:26.385348 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a5e204-886d-416f-96ad-46cc7715e417" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.385384 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a5e204-886d-416f-96ad-46cc7715e417" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.385600 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a5e204-886d-416f-96ad-46cc7715e417" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.386254 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.390093 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.390254 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.390270 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.390380 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.398458 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r"] Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.501855 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l54wb\" (UniqueName: \"kubernetes.io/projected/5ba9a62a-eb41-401f-ac26-779fb50b276a-kube-api-access-l54wb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.501933 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.502003 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.604112 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l54wb\" (UniqueName: \"kubernetes.io/projected/5ba9a62a-eb41-401f-ac26-779fb50b276a-kube-api-access-l54wb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.604208 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.604299 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.607866 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.612775 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.626784 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l54wb\" (UniqueName: \"kubernetes.io/projected/5ba9a62a-eb41-401f-ac26-779fb50b276a-kube-api-access-l54wb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-74r7r\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:26 crc kubenswrapper[4574]: I1004 05:10:26.712655 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:27 crc kubenswrapper[4574]: I1004 05:10:27.260482 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r"] Oct 04 05:10:27 crc kubenswrapper[4574]: W1004 05:10:27.270924 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ba9a62a_eb41_401f_ac26_779fb50b276a.slice/crio-749374536d85280ed92ac9ba7a76b7db3e2ce99c63173c672f05d62dac72f9c9 WatchSource:0}: Error finding container 749374536d85280ed92ac9ba7a76b7db3e2ce99c63173c672f05d62dac72f9c9: Status 404 returned error can't find the container with id 749374536d85280ed92ac9ba7a76b7db3e2ce99c63173c672f05d62dac72f9c9 Oct 04 05:10:27 crc kubenswrapper[4574]: I1004 05:10:27.331486 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" event={"ID":"5ba9a62a-eb41-401f-ac26-779fb50b276a","Type":"ContainerStarted","Data":"749374536d85280ed92ac9ba7a76b7db3e2ce99c63173c672f05d62dac72f9c9"} Oct 04 05:10:27 crc kubenswrapper[4574]: I1004 05:10:27.947731 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k8zkp" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="registry-server" probeResult="failure" output=< Oct 04 05:10:27 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:10:27 crc kubenswrapper[4574]: > Oct 04 05:10:29 crc kubenswrapper[4574]: I1004 05:10:29.350021 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" event={"ID":"5ba9a62a-eb41-401f-ac26-779fb50b276a","Type":"ContainerStarted","Data":"082d8bc0b1b817338bc615b9452895c720125f7bfa95467889ab7379061e0491"} Oct 04 05:10:30 crc kubenswrapper[4574]: I1004 05:10:30.405758 4574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" podStartSLOduration=2.6417151580000002 podStartE2EDuration="4.405738257s" podCreationTimestamp="2025-10-04 05:10:26 +0000 UTC" firstStartedPulling="2025-10-04 05:10:27.273791436 +0000 UTC m=+1453.127934478" lastFinishedPulling="2025-10-04 05:10:29.037814535 +0000 UTC m=+1454.891957577" observedRunningTime="2025-10-04 05:10:30.397113427 +0000 UTC m=+1456.251256469" watchObservedRunningTime="2025-10-04 05:10:30.405738257 +0000 UTC m=+1456.259881299" Oct 04 05:10:32 crc kubenswrapper[4574]: I1004 05:10:32.383265 4574 generic.go:334] "Generic (PLEG): container finished" podID="5ba9a62a-eb41-401f-ac26-779fb50b276a" containerID="082d8bc0b1b817338bc615b9452895c720125f7bfa95467889ab7379061e0491" exitCode=0 Oct 04 05:10:32 crc kubenswrapper[4574]: I1004 05:10:32.383348 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" event={"ID":"5ba9a62a-eb41-401f-ac26-779fb50b276a","Type":"ContainerDied","Data":"082d8bc0b1b817338bc615b9452895c720125f7bfa95467889ab7379061e0491"} Oct 04 05:10:33 crc kubenswrapper[4574]: I1004 05:10:33.815509 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:33 crc kubenswrapper[4574]: I1004 05:10:33.968524 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l54wb\" (UniqueName: \"kubernetes.io/projected/5ba9a62a-eb41-401f-ac26-779fb50b276a-kube-api-access-l54wb\") pod \"5ba9a62a-eb41-401f-ac26-779fb50b276a\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " Oct 04 05:10:33 crc kubenswrapper[4574]: I1004 05:10:33.968671 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-ssh-key\") pod \"5ba9a62a-eb41-401f-ac26-779fb50b276a\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " Oct 04 05:10:33 crc kubenswrapper[4574]: I1004 05:10:33.968691 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-inventory\") pod \"5ba9a62a-eb41-401f-ac26-779fb50b276a\" (UID: \"5ba9a62a-eb41-401f-ac26-779fb50b276a\") " Oct 04 05:10:33 crc kubenswrapper[4574]: I1004 05:10:33.978888 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba9a62a-eb41-401f-ac26-779fb50b276a-kube-api-access-l54wb" (OuterVolumeSpecName: "kube-api-access-l54wb") pod "5ba9a62a-eb41-401f-ac26-779fb50b276a" (UID: "5ba9a62a-eb41-401f-ac26-779fb50b276a"). InnerVolumeSpecName "kube-api-access-l54wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.001430 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-inventory" (OuterVolumeSpecName: "inventory") pod "5ba9a62a-eb41-401f-ac26-779fb50b276a" (UID: "5ba9a62a-eb41-401f-ac26-779fb50b276a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.024909 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ba9a62a-eb41-401f-ac26-779fb50b276a" (UID: "5ba9a62a-eb41-401f-ac26-779fb50b276a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.071000 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.071034 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ba9a62a-eb41-401f-ac26-779fb50b276a-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.071044 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l54wb\" (UniqueName: \"kubernetes.io/projected/5ba9a62a-eb41-401f-ac26-779fb50b276a-kube-api-access-l54wb\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.402092 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" event={"ID":"5ba9a62a-eb41-401f-ac26-779fb50b276a","Type":"ContainerDied","Data":"749374536d85280ed92ac9ba7a76b7db3e2ce99c63173c672f05d62dac72f9c9"} Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.402463 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="749374536d85280ed92ac9ba7a76b7db3e2ce99c63173c672f05d62dac72f9c9" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.402152 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-74r7r" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.702587 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8bs4" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" probeResult="failure" output=< Oct 04 05:10:34 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:10:34 crc kubenswrapper[4574]: > Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.910495 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2"] Oct 04 05:10:34 crc kubenswrapper[4574]: E1004 05:10:34.910982 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba9a62a-eb41-401f-ac26-779fb50b276a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.911001 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba9a62a-eb41-401f-ac26-779fb50b276a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.911273 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba9a62a-eb41-401f-ac26-779fb50b276a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.912038 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.913996 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.914328 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.914401 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.914446 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:10:34 crc kubenswrapper[4574]: I1004 05:10:34.947126 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2"] Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.090092 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnw2t\" (UniqueName: \"kubernetes.io/projected/1e9631eb-d051-4087-81eb-2f33ea4dd993-kube-api-access-vnw2t\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.090269 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 
05:10:35.090317 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.090385 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.192825 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.193165 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.193318 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnw2t\" (UniqueName: \"kubernetes.io/projected/1e9631eb-d051-4087-81eb-2f33ea4dd993-kube-api-access-vnw2t\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.193911 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.200048 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.200257 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.213679 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.215450 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnw2t\" (UniqueName: \"kubernetes.io/projected/1e9631eb-d051-4087-81eb-2f33ea4dd993-kube-api-access-vnw2t\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.237643 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:10:35 crc kubenswrapper[4574]: I1004 05:10:35.833072 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2"] Oct 04 05:10:36 crc kubenswrapper[4574]: I1004 05:10:36.425689 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" event={"ID":"1e9631eb-d051-4087-81eb-2f33ea4dd993","Type":"ContainerStarted","Data":"5709a64ce9911bda437b8bc31497015a04e00597251178e58df4c5fc25b6e99b"} Oct 04 05:10:36 crc kubenswrapper[4574]: I1004 05:10:36.953065 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:37 crc kubenswrapper[4574]: I1004 05:10:37.021142 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:37 crc kubenswrapper[4574]: I1004 05:10:37.438111 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" event={"ID":"1e9631eb-d051-4087-81eb-2f33ea4dd993","Type":"ContainerStarted","Data":"049c6977c6f13070e724e84d45ce3ca0cf53c2bd6ec62207afe8c1562ee4311b"} Oct 04 05:10:37 crc kubenswrapper[4574]: I1004 05:10:37.463034 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" podStartSLOduration=2.720561082 podStartE2EDuration="3.46301319s" podCreationTimestamp="2025-10-04 05:10:34 +0000 UTC" firstStartedPulling="2025-10-04 05:10:35.844632843 +0000 UTC m=+1461.698775885" lastFinishedPulling="2025-10-04 05:10:36.587084941 +0000 UTC m=+1462.441227993" observedRunningTime="2025-10-04 05:10:37.45575643 +0000 UTC m=+1463.309899472" watchObservedRunningTime="2025-10-04 05:10:37.46301319 +0000 UTC m=+1463.317156232" Oct 04 05:10:37 crc kubenswrapper[4574]: I1004 05:10:37.753330 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8zkp"] Oct 04 05:10:38 crc kubenswrapper[4574]: I1004 05:10:38.445810 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k8zkp" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="registry-server" containerID="cri-o://6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57" gracePeriod=2 Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.043135 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.085143 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-utilities\") pod \"813082f8-5c17-4b5f-b0f6-f1f956efd469\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.085201 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9628g\" (UniqueName: \"kubernetes.io/projected/813082f8-5c17-4b5f-b0f6-f1f956efd469-kube-api-access-9628g\") pod \"813082f8-5c17-4b5f-b0f6-f1f956efd469\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.085258 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-catalog-content\") pod \"813082f8-5c17-4b5f-b0f6-f1f956efd469\" (UID: \"813082f8-5c17-4b5f-b0f6-f1f956efd469\") " Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.086704 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-utilities" (OuterVolumeSpecName: "utilities") pod "813082f8-5c17-4b5f-b0f6-f1f956efd469" (UID: "813082f8-5c17-4b5f-b0f6-f1f956efd469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.106105 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813082f8-5c17-4b5f-b0f6-f1f956efd469-kube-api-access-9628g" (OuterVolumeSpecName: "kube-api-access-9628g") pod "813082f8-5c17-4b5f-b0f6-f1f956efd469" (UID: "813082f8-5c17-4b5f-b0f6-f1f956efd469"). InnerVolumeSpecName "kube-api-access-9628g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.165224 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "813082f8-5c17-4b5f-b0f6-f1f956efd469" (UID: "813082f8-5c17-4b5f-b0f6-f1f956efd469"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.188092 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.188149 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9628g\" (UniqueName: \"kubernetes.io/projected/813082f8-5c17-4b5f-b0f6-f1f956efd469-kube-api-access-9628g\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.188160 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813082f8-5c17-4b5f-b0f6-f1f956efd469-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.457297 4574 generic.go:334] "Generic (PLEG): container finished" podID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerID="6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57" exitCode=0 Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.457385 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8zkp" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.457385 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8zkp" event={"ID":"813082f8-5c17-4b5f-b0f6-f1f956efd469","Type":"ContainerDied","Data":"6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57"} Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.457785 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8zkp" event={"ID":"813082f8-5c17-4b5f-b0f6-f1f956efd469","Type":"ContainerDied","Data":"304f7bddcb5fe41e0e4aaee75e47654d4ea51236939394fb1ee51295d69be2e2"} Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.457807 4574 scope.go:117] "RemoveContainer" containerID="6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.476945 4574 scope.go:117] "RemoveContainer" containerID="c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.491996 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8zkp"] Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.500349 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k8zkp"] Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.529863 4574 scope.go:117] "RemoveContainer" containerID="84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.555762 4574 scope.go:117] "RemoveContainer" containerID="6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57" Oct 04 05:10:39 crc kubenswrapper[4574]: E1004 05:10:39.557741 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57\": container with ID starting with 6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57 not found: ID does not exist" containerID="6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.557781 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57"} err="failed to get container status \"6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57\": rpc error: code = NotFound desc = could not find container \"6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57\": container with ID starting with 6c3917c4b6187af3f52e3b73960318bd826e6183a878e0efb8aafde1b5f4ef57 not found: ID does not exist" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.557805 4574 scope.go:117] "RemoveContainer" containerID="c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7" Oct 04 05:10:39 crc kubenswrapper[4574]: E1004 05:10:39.558056 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7\": container with ID starting with c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7 not found: ID does not exist" containerID="c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.558078 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7"} err="failed to get container status \"c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7\": rpc error: code = NotFound desc = could not find container \"c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7\": container with ID 
starting with c3c2aa6a6ebe169233277354a016cf81eafe396857842ab93dd6e5d67ef1d3d7 not found: ID does not exist" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.558094 4574 scope.go:117] "RemoveContainer" containerID="84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0" Oct 04 05:10:39 crc kubenswrapper[4574]: E1004 05:10:39.558332 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0\": container with ID starting with 84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0 not found: ID does not exist" containerID="84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0" Oct 04 05:10:39 crc kubenswrapper[4574]: I1004 05:10:39.558356 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0"} err="failed to get container status \"84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0\": rpc error: code = NotFound desc = could not find container \"84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0\": container with ID starting with 84e81170b03ce525739afb264630264cb6fa1eaccc4631e1a68a2a3ccec8f0b0 not found: ID does not exist" Oct 04 05:10:40 crc kubenswrapper[4574]: I1004 05:10:40.742909 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" path="/var/lib/kubelet/pods/813082f8-5c17-4b5f-b0f6-f1f956efd469/volumes" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.162954 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-968tn"] Oct 04 05:10:42 crc kubenswrapper[4574]: E1004 05:10:42.164039 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="extract-utilities" Oct 04 05:10:42 crc 
kubenswrapper[4574]: I1004 05:10:42.164057 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="extract-utilities" Oct 04 05:10:42 crc kubenswrapper[4574]: E1004 05:10:42.164088 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="extract-content" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.164096 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="extract-content" Oct 04 05:10:42 crc kubenswrapper[4574]: E1004 05:10:42.164110 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="registry-server" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.164118 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="registry-server" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.164405 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="813082f8-5c17-4b5f-b0f6-f1f956efd469" containerName="registry-server" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.166176 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.197848 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-968tn"] Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.341775 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-utilities\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.342716 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-catalog-content\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.342862 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrv4x\" (UniqueName: \"kubernetes.io/projected/568ba34b-27dc-42bb-9d0e-71db48014ae1-kube-api-access-hrv4x\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.445266 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-utilities\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.445688 4574 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-catalog-content\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.445839 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrv4x\" (UniqueName: \"kubernetes.io/projected/568ba34b-27dc-42bb-9d0e-71db48014ae1-kube-api-access-hrv4x\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.446069 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-utilities\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.446175 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-catalog-content\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.468191 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrv4x\" (UniqueName: \"kubernetes.io/projected/568ba34b-27dc-42bb-9d0e-71db48014ae1-kube-api-access-hrv4x\") pod \"redhat-marketplace-968tn\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:42 crc kubenswrapper[4574]: I1004 05:10:42.487526 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:43 crc kubenswrapper[4574]: I1004 05:10:43.026869 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-968tn"] Oct 04 05:10:43 crc kubenswrapper[4574]: I1004 05:10:43.496739 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerStarted","Data":"f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057"} Oct 04 05:10:43 crc kubenswrapper[4574]: I1004 05:10:43.497054 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerStarted","Data":"8dadc4c32caca4106c9b630677076d525a7af3c8057888997ac33ea9c9200405"} Oct 04 05:10:44 crc kubenswrapper[4574]: I1004 05:10:44.505555 4574 generic.go:334] "Generic (PLEG): container finished" podID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerID="f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057" exitCode=0 Oct 04 05:10:44 crc kubenswrapper[4574]: I1004 05:10:44.505642 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerDied","Data":"f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057"} Oct 04 05:10:44 crc kubenswrapper[4574]: I1004 05:10:44.703636 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8bs4" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" probeResult="failure" output=< Oct 04 05:10:44 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:10:44 crc kubenswrapper[4574]: > Oct 04 05:10:46 crc kubenswrapper[4574]: I1004 05:10:46.528182 4574 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerStarted","Data":"1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e"} Oct 04 05:10:47 crc kubenswrapper[4574]: I1004 05:10:47.541038 4574 generic.go:334] "Generic (PLEG): container finished" podID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerID="1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e" exitCode=0 Oct 04 05:10:47 crc kubenswrapper[4574]: I1004 05:10:47.541108 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerDied","Data":"1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e"} Oct 04 05:10:49 crc kubenswrapper[4574]: I1004 05:10:49.404765 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:10:49 crc kubenswrapper[4574]: I1004 05:10:49.405413 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:10:49 crc kubenswrapper[4574]: I1004 05:10:49.563854 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerStarted","Data":"5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718"} Oct 04 05:10:49 crc kubenswrapper[4574]: I1004 05:10:49.592291 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-968tn" podStartSLOduration=3.494303809 podStartE2EDuration="7.592260925s" podCreationTimestamp="2025-10-04 05:10:42 +0000 UTC" firstStartedPulling="2025-10-04 05:10:44.507440162 +0000 UTC m=+1470.361583204" lastFinishedPulling="2025-10-04 05:10:48.605397278 +0000 UTC m=+1474.459540320" observedRunningTime="2025-10-04 05:10:49.582040269 +0000 UTC m=+1475.436183311" watchObservedRunningTime="2025-10-04 05:10:49.592260925 +0000 UTC m=+1475.446403967" Oct 04 05:10:52 crc kubenswrapper[4574]: I1004 05:10:52.488649 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:52 crc kubenswrapper[4574]: I1004 05:10:52.489052 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:52 crc kubenswrapper[4574]: I1004 05:10:52.534816 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:10:54 crc kubenswrapper[4574]: I1004 05:10:54.693687 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8bs4" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" probeResult="failure" output=< Oct 04 05:10:54 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:10:54 crc kubenswrapper[4574]: > Oct 04 05:11:02 crc kubenswrapper[4574]: I1004 05:11:02.549468 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:11:02 crc kubenswrapper[4574]: I1004 05:11:02.610054 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-968tn"] Oct 04 05:11:02 crc kubenswrapper[4574]: I1004 05:11:02.675565 4574 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-968tn" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="registry-server" containerID="cri-o://5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718" gracePeriod=2 Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.154774 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.276116 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-utilities\") pod \"568ba34b-27dc-42bb-9d0e-71db48014ae1\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.276191 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-catalog-content\") pod \"568ba34b-27dc-42bb-9d0e-71db48014ae1\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.276286 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrv4x\" (UniqueName: \"kubernetes.io/projected/568ba34b-27dc-42bb-9d0e-71db48014ae1-kube-api-access-hrv4x\") pod \"568ba34b-27dc-42bb-9d0e-71db48014ae1\" (UID: \"568ba34b-27dc-42bb-9d0e-71db48014ae1\") " Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.277300 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-utilities" (OuterVolumeSpecName: "utilities") pod "568ba34b-27dc-42bb-9d0e-71db48014ae1" (UID: "568ba34b-27dc-42bb-9d0e-71db48014ae1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.284592 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568ba34b-27dc-42bb-9d0e-71db48014ae1-kube-api-access-hrv4x" (OuterVolumeSpecName: "kube-api-access-hrv4x") pod "568ba34b-27dc-42bb-9d0e-71db48014ae1" (UID: "568ba34b-27dc-42bb-9d0e-71db48014ae1"). InnerVolumeSpecName "kube-api-access-hrv4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.291493 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "568ba34b-27dc-42bb-9d0e-71db48014ae1" (UID: "568ba34b-27dc-42bb-9d0e-71db48014ae1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.378157 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrv4x\" (UniqueName: \"kubernetes.io/projected/568ba34b-27dc-42bb-9d0e-71db48014ae1-kube-api-access-hrv4x\") on node \"crc\" DevicePath \"\"" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.378192 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.378207 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568ba34b-27dc-42bb-9d0e-71db48014ae1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.687523 4574 generic.go:334] "Generic (PLEG): container finished" podID="568ba34b-27dc-42bb-9d0e-71db48014ae1" 
containerID="5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718" exitCode=0 Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.687712 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerDied","Data":"5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718"} Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.688626 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-968tn" event={"ID":"568ba34b-27dc-42bb-9d0e-71db48014ae1","Type":"ContainerDied","Data":"8dadc4c32caca4106c9b630677076d525a7af3c8057888997ac33ea9c9200405"} Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.688761 4574 scope.go:117] "RemoveContainer" containerID="5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.687786 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-968tn" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.730995 4574 scope.go:117] "RemoveContainer" containerID="1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.736817 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-968tn"] Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.747421 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-968tn"] Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.759189 4574 scope.go:117] "RemoveContainer" containerID="f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.799727 4574 scope.go:117] "RemoveContainer" containerID="5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718" Oct 04 05:11:03 crc kubenswrapper[4574]: E1004 05:11:03.800322 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718\": container with ID starting with 5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718 not found: ID does not exist" containerID="5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.800354 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718"} err="failed to get container status \"5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718\": rpc error: code = NotFound desc = could not find container \"5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718\": container with ID starting with 5e3f029d121e87f0f577759d0a504d01265e2e06d189581c06739157e4a8a718 not found: 
ID does not exist" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.800373 4574 scope.go:117] "RemoveContainer" containerID="1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e" Oct 04 05:11:03 crc kubenswrapper[4574]: E1004 05:11:03.800860 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e\": container with ID starting with 1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e not found: ID does not exist" containerID="1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.800907 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e"} err="failed to get container status \"1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e\": rpc error: code = NotFound desc = could not find container \"1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e\": container with ID starting with 1422e5223293bd9bf4d09776c697874d98d98bc8c288a9ef69e2869448c4804e not found: ID does not exist" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.800973 4574 scope.go:117] "RemoveContainer" containerID="f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057" Oct 04 05:11:03 crc kubenswrapper[4574]: E1004 05:11:03.801650 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057\": container with ID starting with f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057 not found: ID does not exist" containerID="f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057" Oct 04 05:11:03 crc kubenswrapper[4574]: I1004 05:11:03.801698 4574 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057"} err="failed to get container status \"f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057\": rpc error: code = NotFound desc = could not find container \"f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057\": container with ID starting with f831e2bf246e6614371a5e49c06c251123ef5125ad605da5b1f9f493d882c057 not found: ID does not exist" Oct 04 05:11:04 crc kubenswrapper[4574]: I1004 05:11:04.699053 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8bs4" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" probeResult="failure" output=< Oct 04 05:11:04 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:11:04 crc kubenswrapper[4574]: > Oct 04 05:11:04 crc kubenswrapper[4574]: I1004 05:11:04.744942 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" path="/var/lib/kubelet/pods/568ba34b-27dc-42bb-9d0e-71db48014ae1/volumes" Oct 04 05:11:14 crc kubenswrapper[4574]: I1004 05:11:14.694872 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l8bs4" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" probeResult="failure" output=< Oct 04 05:11:14 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:11:14 crc kubenswrapper[4574]: > Oct 04 05:11:19 crc kubenswrapper[4574]: I1004 05:11:19.405327 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:11:19 crc kubenswrapper[4574]: I1004 05:11:19.405794 4574 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:11:21 crc kubenswrapper[4574]: I1004 05:11:21.869953 4574 scope.go:117] "RemoveContainer" containerID="122da771d0cb8ada6112f2727c6c5cb4446022f39db47a10923c65bbc6fa4965" Oct 04 05:11:21 crc kubenswrapper[4574]: I1004 05:11:21.918438 4574 scope.go:117] "RemoveContainer" containerID="9996ddb2a7c43991e68612cd13d2b5b7095f1eb04d31dbac97dfb32af1bee999" Oct 04 05:11:23 crc kubenswrapper[4574]: I1004 05:11:23.715152 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:11:23 crc kubenswrapper[4574]: I1004 05:11:23.769651 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:11:23 crc kubenswrapper[4574]: I1004 05:11:23.957508 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l8bs4"] Oct 04 05:11:24 crc kubenswrapper[4574]: I1004 05:11:24.878444 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l8bs4" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" containerID="cri-o://f3e1c2d4f00cf884966f3a30e812683119ecf011c56c0ca8d46265dfed225282" gracePeriod=2 Oct 04 05:11:25 crc kubenswrapper[4574]: I1004 05:11:25.904272 4574 generic.go:334] "Generic (PLEG): container finished" podID="30b6fbc3-1578-424c-8b33-a22582f46051" containerID="f3e1c2d4f00cf884966f3a30e812683119ecf011c56c0ca8d46265dfed225282" exitCode=0 Oct 04 05:11:25 crc kubenswrapper[4574]: I1004 05:11:25.904285 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-l8bs4" event={"ID":"30b6fbc3-1578-424c-8b33-a22582f46051","Type":"ContainerDied","Data":"f3e1c2d4f00cf884966f3a30e812683119ecf011c56c0ca8d46265dfed225282"} Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.052876 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.128171 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-utilities\") pod \"30b6fbc3-1578-424c-8b33-a22582f46051\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.128487 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jw4w\" (UniqueName: \"kubernetes.io/projected/30b6fbc3-1578-424c-8b33-a22582f46051-kube-api-access-9jw4w\") pod \"30b6fbc3-1578-424c-8b33-a22582f46051\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.128537 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-catalog-content\") pod \"30b6fbc3-1578-424c-8b33-a22582f46051\" (UID: \"30b6fbc3-1578-424c-8b33-a22582f46051\") " Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.129699 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-utilities" (OuterVolumeSpecName: "utilities") pod "30b6fbc3-1578-424c-8b33-a22582f46051" (UID: "30b6fbc3-1578-424c-8b33-a22582f46051"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.134813 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b6fbc3-1578-424c-8b33-a22582f46051-kube-api-access-9jw4w" (OuterVolumeSpecName: "kube-api-access-9jw4w") pod "30b6fbc3-1578-424c-8b33-a22582f46051" (UID: "30b6fbc3-1578-424c-8b33-a22582f46051"). InnerVolumeSpecName "kube-api-access-9jw4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.216918 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30b6fbc3-1578-424c-8b33-a22582f46051" (UID: "30b6fbc3-1578-424c-8b33-a22582f46051"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.230738 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jw4w\" (UniqueName: \"kubernetes.io/projected/30b6fbc3-1578-424c-8b33-a22582f46051-kube-api-access-9jw4w\") on node \"crc\" DevicePath \"\"" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.230788 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.230802 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30b6fbc3-1578-424c-8b33-a22582f46051-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.914416 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8bs4" 
event={"ID":"30b6fbc3-1578-424c-8b33-a22582f46051","Type":"ContainerDied","Data":"e121425457c0ab688a98d690380306fb42ffa95c6bcbdedb254a06ee1aabea03"} Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.914470 4574 scope.go:117] "RemoveContainer" containerID="f3e1c2d4f00cf884966f3a30e812683119ecf011c56c0ca8d46265dfed225282" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.914470 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l8bs4" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.942163 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l8bs4"] Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.949313 4574 scope.go:117] "RemoveContainer" containerID="3a6a6a5e915da3f42e99e63455ff19aa1ede7e3e579a48c30dd3932569394983" Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.955074 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l8bs4"] Oct 04 05:11:26 crc kubenswrapper[4574]: I1004 05:11:26.971393 4574 scope.go:117] "RemoveContainer" containerID="eaf6623c5182f9f6e307515d17e104ca2bb91e0fd4a5f356632ce9128ab8ed85" Oct 04 05:11:28 crc kubenswrapper[4574]: I1004 05:11:28.744774 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" path="/var/lib/kubelet/pods/30b6fbc3-1578-424c-8b33-a22582f46051/volumes" Oct 04 05:11:49 crc kubenswrapper[4574]: I1004 05:11:49.404624 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:11:49 crc kubenswrapper[4574]: I1004 05:11:49.406794 4574 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:11:49 crc kubenswrapper[4574]: I1004 05:11:49.406868 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:11:49 crc kubenswrapper[4574]: I1004 05:11:49.408065 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:11:49 crc kubenswrapper[4574]: I1004 05:11:49.408145 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" gracePeriod=600 Oct 04 05:11:49 crc kubenswrapper[4574]: E1004 05:11:49.538371 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:11:50 crc kubenswrapper[4574]: I1004 05:11:50.141894 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" 
containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" exitCode=0 Oct 04 05:11:50 crc kubenswrapper[4574]: I1004 05:11:50.141945 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc"} Oct 04 05:11:50 crc kubenswrapper[4574]: I1004 05:11:50.142464 4574 scope.go:117] "RemoveContainer" containerID="422e81dba527fdce5fb46863c657f7a61bb4d0e601b192c209383ffeaf65198f" Oct 04 05:11:50 crc kubenswrapper[4574]: I1004 05:11:50.143382 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:11:50 crc kubenswrapper[4574]: E1004 05:11:50.143751 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:12:03 crc kubenswrapper[4574]: I1004 05:12:03.733881 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:12:03 crc kubenswrapper[4574]: E1004 05:12:03.734827 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:12:15 crc kubenswrapper[4574]: I1004 
05:12:15.732899 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:12:15 crc kubenswrapper[4574]: E1004 05:12:15.735036 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:12:22 crc kubenswrapper[4574]: I1004 05:12:22.087134 4574 scope.go:117] "RemoveContainer" containerID="655dae3bd54eaa26e8f66741694c4f05a6d40744481378a1a8f28f6c7e36ea08" Oct 04 05:12:22 crc kubenswrapper[4574]: I1004 05:12:22.128309 4574 scope.go:117] "RemoveContainer" containerID="67d119404faa86d73dc38c89ef684141fecea4762f7fa63a1580749cea1c68c2" Oct 04 05:12:27 crc kubenswrapper[4574]: I1004 05:12:27.733112 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:12:27 crc kubenswrapper[4574]: E1004 05:12:27.733914 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:12:42 crc kubenswrapper[4574]: I1004 05:12:42.734002 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:12:42 crc kubenswrapper[4574]: E1004 05:12:42.736359 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:12:54 crc kubenswrapper[4574]: I1004 05:12:54.740328 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:12:54 crc kubenswrapper[4574]: E1004 05:12:54.741249 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:13:05 crc kubenswrapper[4574]: I1004 05:13:05.733304 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:13:05 crc kubenswrapper[4574]: E1004 05:13:05.734022 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:13:20 crc kubenswrapper[4574]: I1004 05:13:20.733790 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:13:20 crc kubenswrapper[4574]: E1004 05:13:20.734679 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:13:34 crc kubenswrapper[4574]: I1004 05:13:34.741227 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:13:34 crc kubenswrapper[4574]: E1004 05:13:34.742075 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.042695 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ht5jl"] Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.054608 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-9x4kd"] Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.070858 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-69hmk"] Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.081630 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ht5jl"] Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.092404 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-69hmk"] Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.103271 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-9x4kd"] Oct 04 
05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.748547 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03382769-56f5-45dc-b69c-099992058074" path="/var/lib/kubelet/pods/03382769-56f5-45dc-b69c-099992058074/volumes" Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.750392 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944a360b-002a-4b31-8222-4a2949291694" path="/var/lib/kubelet/pods/944a360b-002a-4b31-8222-4a2949291694/volumes" Oct 04 05:13:42 crc kubenswrapper[4574]: I1004 05:13:42.751821 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb796fd0-70b3-42d1-8b93-c034287498f6" path="/var/lib/kubelet/pods/fb796fd0-70b3-42d1-8b93-c034287498f6/volumes" Oct 04 05:13:47 crc kubenswrapper[4574]: I1004 05:13:47.732875 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:13:47 crc kubenswrapper[4574]: E1004 05:13:47.733664 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:13:51 crc kubenswrapper[4574]: I1004 05:13:51.034900 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1227-account-create-txzkj"] Oct 04 05:13:51 crc kubenswrapper[4574]: I1004 05:13:51.045294 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1227-account-create-txzkj"] Oct 04 05:13:52 crc kubenswrapper[4574]: I1004 05:13:52.040215 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2a59-account-create-b4x5h"] Oct 04 05:13:52 crc kubenswrapper[4574]: I1004 05:13:52.050833 4574 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-84c7-account-create-bjdjw"] Oct 04 05:13:52 crc kubenswrapper[4574]: I1004 05:13:52.062518 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2a59-account-create-b4x5h"] Oct 04 05:13:52 crc kubenswrapper[4574]: I1004 05:13:52.072098 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-84c7-account-create-bjdjw"] Oct 04 05:13:52 crc kubenswrapper[4574]: I1004 05:13:52.761766 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f863e6-e6a0-44e9-afbe-f4734c7e2416" path="/var/lib/kubelet/pods/b8f863e6-e6a0-44e9-afbe-f4734c7e2416/volumes" Oct 04 05:13:52 crc kubenswrapper[4574]: I1004 05:13:52.764841 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba5c4cf-83d4-4518-9ce5-73dd9d4e0220" path="/var/lib/kubelet/pods/bba5c4cf-83d4-4518-9ce5-73dd9d4e0220/volumes" Oct 04 05:13:52 crc kubenswrapper[4574]: I1004 05:13:52.807504 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3ee3f0-67c6-4a05-aa14-a5602906d364" path="/var/lib/kubelet/pods/cb3ee3f0-67c6-4a05-aa14-a5602906d364/volumes" Oct 04 05:13:58 crc kubenswrapper[4574]: I1004 05:13:58.733197 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:13:58 crc kubenswrapper[4574]: E1004 05:13:58.733990 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:14:03 crc kubenswrapper[4574]: I1004 05:14:03.035119 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-db-create-s496q"] Oct 04 05:14:03 crc kubenswrapper[4574]: I1004 05:14:03.046502 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-s496q"] Oct 04 05:14:04 crc kubenswrapper[4574]: I1004 05:14:04.747522 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6440c95c-883f-4d9e-b095-6589637f1059" path="/var/lib/kubelet/pods/6440c95c-883f-4d9e-b095-6589637f1059/volumes" Oct 04 05:14:13 crc kubenswrapper[4574]: I1004 05:14:13.040009 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2ssvl"] Oct 04 05:14:13 crc kubenswrapper[4574]: I1004 05:14:13.051705 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-b96fh"] Oct 04 05:14:13 crc kubenswrapper[4574]: I1004 05:14:13.061802 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2ssvl"] Oct 04 05:14:13 crc kubenswrapper[4574]: I1004 05:14:13.070787 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-b96fh"] Oct 04 05:14:13 crc kubenswrapper[4574]: I1004 05:14:13.733349 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:14:13 crc kubenswrapper[4574]: E1004 05:14:13.733650 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:14:14 crc kubenswrapper[4574]: I1004 05:14:14.748162 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80e4ab0-c323-4894-8d2d-23d6e380aeb1" 
path="/var/lib/kubelet/pods/f80e4ab0-c323-4894-8d2d-23d6e380aeb1/volumes" Oct 04 05:14:14 crc kubenswrapper[4574]: I1004 05:14:14.751268 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f937ae2e-fb2c-4632-b030-e4547cb604bf" path="/var/lib/kubelet/pods/f937ae2e-fb2c-4632-b030-e4547cb604bf/volumes" Oct 04 05:14:15 crc kubenswrapper[4574]: I1004 05:14:15.528053 4574 generic.go:334] "Generic (PLEG): container finished" podID="1e9631eb-d051-4087-81eb-2f33ea4dd993" containerID="049c6977c6f13070e724e84d45ce3ca0cf53c2bd6ec62207afe8c1562ee4311b" exitCode=0 Oct 04 05:14:15 crc kubenswrapper[4574]: I1004 05:14:15.528555 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" event={"ID":"1e9631eb-d051-4087-81eb-2f33ea4dd993","Type":"ContainerDied","Data":"049c6977c6f13070e724e84d45ce3ca0cf53c2bd6ec62207afe8c1562ee4311b"} Oct 04 05:14:16 crc kubenswrapper[4574]: I1004 05:14:16.935311 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:14:16 crc kubenswrapper[4574]: I1004 05:14:16.955852 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-bootstrap-combined-ca-bundle\") pod \"1e9631eb-d051-4087-81eb-2f33ea4dd993\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " Oct 04 05:14:16 crc kubenswrapper[4574]: I1004 05:14:16.955999 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-ssh-key\") pod \"1e9631eb-d051-4087-81eb-2f33ea4dd993\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " Oct 04 05:14:16 crc kubenswrapper[4574]: I1004 05:14:16.956043 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnw2t\" (UniqueName: \"kubernetes.io/projected/1e9631eb-d051-4087-81eb-2f33ea4dd993-kube-api-access-vnw2t\") pod \"1e9631eb-d051-4087-81eb-2f33ea4dd993\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " Oct 04 05:14:16 crc kubenswrapper[4574]: I1004 05:14:16.956084 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-inventory\") pod \"1e9631eb-d051-4087-81eb-2f33ea4dd993\" (UID: \"1e9631eb-d051-4087-81eb-2f33ea4dd993\") " Oct 04 05:14:16 crc kubenswrapper[4574]: I1004 05:14:16.964376 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9631eb-d051-4087-81eb-2f33ea4dd993-kube-api-access-vnw2t" (OuterVolumeSpecName: "kube-api-access-vnw2t") pod "1e9631eb-d051-4087-81eb-2f33ea4dd993" (UID: "1e9631eb-d051-4087-81eb-2f33ea4dd993"). InnerVolumeSpecName "kube-api-access-vnw2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:14:16 crc kubenswrapper[4574]: I1004 05:14:16.964936 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1e9631eb-d051-4087-81eb-2f33ea4dd993" (UID: "1e9631eb-d051-4087-81eb-2f33ea4dd993"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.004253 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-inventory" (OuterVolumeSpecName: "inventory") pod "1e9631eb-d051-4087-81eb-2f33ea4dd993" (UID: "1e9631eb-d051-4087-81eb-2f33ea4dd993"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.005854 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1e9631eb-d051-4087-81eb-2f33ea4dd993" (UID: "1e9631eb-d051-4087-81eb-2f33ea4dd993"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.057458 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.057512 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnw2t\" (UniqueName: \"kubernetes.io/projected/1e9631eb-d051-4087-81eb-2f33ea4dd993-kube-api-access-vnw2t\") on node \"crc\" DevicePath \"\"" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.057523 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.057532 4574 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9631eb-d051-4087-81eb-2f33ea4dd993-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.589038 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" event={"ID":"1e9631eb-d051-4087-81eb-2f33ea4dd993","Type":"ContainerDied","Data":"5709a64ce9911bda437b8bc31497015a04e00597251178e58df4c5fc25b6e99b"} Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.589087 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5709a64ce9911bda437b8bc31497015a04e00597251178e58df4c5fc25b6e99b" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.589180 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.684866 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs"] Oct 04 05:14:17 crc kubenswrapper[4574]: E1004 05:14:17.685670 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="extract-utilities" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.685694 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="extract-utilities" Oct 04 05:14:17 crc kubenswrapper[4574]: E1004 05:14:17.685709 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9631eb-d051-4087-81eb-2f33ea4dd993" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.685717 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9631eb-d051-4087-81eb-2f33ea4dd993" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 04 05:14:17 crc kubenswrapper[4574]: E1004 05:14:17.685731 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="extract-utilities" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.685737 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="extract-utilities" Oct 04 05:14:17 crc kubenswrapper[4574]: E1004 05:14:17.685762 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="extract-content" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.685767 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="extract-content" Oct 04 05:14:17 crc kubenswrapper[4574]: E1004 
05:14:17.685779 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="registry-server" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.685784 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="registry-server" Oct 04 05:14:17 crc kubenswrapper[4574]: E1004 05:14:17.685795 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="extract-content" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.685801 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="extract-content" Oct 04 05:14:17 crc kubenswrapper[4574]: E1004 05:14:17.685816 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.685821 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.686039 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9631eb-d051-4087-81eb-2f33ea4dd993" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.686063 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b6fbc3-1578-424c-8b33-a22582f46051" containerName="registry-server" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.686092 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="568ba34b-27dc-42bb-9d0e-71db48014ae1" containerName="registry-server" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.687167 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.693786 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.693860 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.694834 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.695490 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.720980 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs"] Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.785492 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkh2b\" (UniqueName: \"kubernetes.io/projected/b8508613-3769-4000-9037-bce43bf206bb-kube-api-access-xkh2b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.785584 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.785839 
4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.888053 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.888260 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkh2b\" (UniqueName: \"kubernetes.io/projected/b8508613-3769-4000-9037-bce43bf206bb-kube-api-access-xkh2b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.888307 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.905358 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.905419 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:17 crc kubenswrapper[4574]: I1004 05:14:17.908389 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkh2b\" (UniqueName: \"kubernetes.io/projected/b8508613-3769-4000-9037-bce43bf206bb-kube-api-access-xkh2b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:18 crc kubenswrapper[4574]: I1004 05:14:18.022104 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:14:18 crc kubenswrapper[4574]: I1004 05:14:18.567279 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs"] Oct 04 05:14:18 crc kubenswrapper[4574]: I1004 05:14:18.567555 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:14:18 crc kubenswrapper[4574]: I1004 05:14:18.602940 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" event={"ID":"b8508613-3769-4000-9037-bce43bf206bb","Type":"ContainerStarted","Data":"72b15cfeea6727856ce23f36909adf922dbfd0d28fb6813f47d0190aac7b2177"} Oct 04 05:14:19 crc kubenswrapper[4574]: I1004 05:14:19.038200 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-g5d9n"] Oct 04 05:14:19 crc kubenswrapper[4574]: I1004 05:14:19.048753 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-g5d9n"] Oct 04 05:14:19 crc kubenswrapper[4574]: I1004 05:14:19.613541 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" event={"ID":"b8508613-3769-4000-9037-bce43bf206bb","Type":"ContainerStarted","Data":"6a8d9933f29c59eef0dc78f96bffcf04176ff127a97bf2dda750cbce619f0655"} Oct 04 05:14:19 crc kubenswrapper[4574]: I1004 05:14:19.637672 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" podStartSLOduration=2.221426791 podStartE2EDuration="2.637650774s" podCreationTimestamp="2025-10-04 05:14:17 +0000 UTC" firstStartedPulling="2025-10-04 05:14:18.567069051 +0000 UTC m=+1684.421212103" lastFinishedPulling="2025-10-04 05:14:18.983293044 +0000 UTC m=+1684.837436086" observedRunningTime="2025-10-04 05:14:19.629075026 
+0000 UTC m=+1685.483218078" watchObservedRunningTime="2025-10-04 05:14:19.637650774 +0000 UTC m=+1685.491793806" Oct 04 05:14:20 crc kubenswrapper[4574]: I1004 05:14:20.747500 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e6f397-f39a-4b3b-af37-05020e371987" path="/var/lib/kubelet/pods/65e6f397-f39a-4b3b-af37-05020e371987/volumes" Oct 04 05:14:21 crc kubenswrapper[4574]: I1004 05:14:21.025481 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-32a0-account-create-mtnfl"] Oct 04 05:14:21 crc kubenswrapper[4574]: I1004 05:14:21.036293 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-32a0-account-create-mtnfl"] Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.318181 4574 scope.go:117] "RemoveContainer" containerID="78e7609ad41fa3866fe0ad635849f1eb6b025c249893a32cb355686451bf595d" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.371791 4574 scope.go:117] "RemoveContainer" containerID="2d9abd6a53e5fe383c3bcc7fa051c48e4c89ec695bdb630ff2cc65c14c7c5114" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.397506 4574 scope.go:117] "RemoveContainer" containerID="f6d3f86c7280e708740c2665de9fa74c18344b39bdfaede64f0187ba16a7977d" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.452019 4574 scope.go:117] "RemoveContainer" containerID="50a72173a69ddc6eaa1b33fc15981e0f540f181d6f9eb3a580d63c7c6aa335d1" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.507796 4574 scope.go:117] "RemoveContainer" containerID="35de2fbc9f58be3c2711718e28d35dae95b4d2c1baeb9879af535623cad155a7" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.551498 4574 scope.go:117] "RemoveContainer" containerID="06b333b01bed766728865b88c1ab0187ee11cb7871910ac8cedcce76ba626acc" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.593352 4574 scope.go:117] "RemoveContainer" containerID="df7b5541b583b8d5d6dd1d9e1606a1c96f1b8c8717d8e3dd8d38efbb8a446b10" Oct 04 05:14:22 crc kubenswrapper[4574]: 
I1004 05:14:22.625624 4574 scope.go:117] "RemoveContainer" containerID="13c5bc57cfc1b6560ec5815abafe4f86fb7c5bb8683656f772c837524befe1e5" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.685301 4574 scope.go:117] "RemoveContainer" containerID="4ff59bb7c1fff34c989100573e0415b28f453a891764111427b5f099b1090b84" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.709135 4574 scope.go:117] "RemoveContainer" containerID="47aa5f056a57adfb702f38a7cd64aeda8d8aefd0a541963a7d1c7efd53f52b9b" Oct 04 05:14:22 crc kubenswrapper[4574]: I1004 05:14:22.745272 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a91fda1d-56f0-41a6-af12-1bf124f53361" path="/var/lib/kubelet/pods/a91fda1d-56f0-41a6-af12-1bf124f53361/volumes" Oct 04 05:14:26 crc kubenswrapper[4574]: I1004 05:14:26.733803 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:14:26 crc kubenswrapper[4574]: E1004 05:14:26.734569 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:14:38 crc kubenswrapper[4574]: I1004 05:14:38.733532 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:14:38 crc kubenswrapper[4574]: E1004 05:14:38.734357 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:14:43 crc kubenswrapper[4574]: I1004 05:14:43.036489 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d026-account-create-lttsc"] Oct 04 05:14:43 crc kubenswrapper[4574]: I1004 05:14:43.045539 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d026-account-create-lttsc"] Oct 04 05:14:44 crc kubenswrapper[4574]: I1004 05:14:44.744726 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c8f6604-3234-44f9-8d5f-24c945ccc8ae" path="/var/lib/kubelet/pods/3c8f6604-3234-44f9-8d5f-24c945ccc8ae/volumes" Oct 04 05:14:45 crc kubenswrapper[4574]: I1004 05:14:45.027793 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-467d-account-create-vc9zc"] Oct 04 05:14:45 crc kubenswrapper[4574]: I1004 05:14:45.037300 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-467d-account-create-vc9zc"] Oct 04 05:14:46 crc kubenswrapper[4574]: I1004 05:14:46.747880 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782fe7ab-749f-434c-ba59-c7ab782dd007" path="/var/lib/kubelet/pods/782fe7ab-749f-434c-ba59-c7ab782dd007/volumes" Oct 04 05:14:50 crc kubenswrapper[4574]: I1004 05:14:50.733223 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:14:50 crc kubenswrapper[4574]: E1004 05:14:50.733796 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:15:00 crc 
kubenswrapper[4574]: I1004 05:15:00.048314 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-kt5kv"] Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.056206 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-75f5m"] Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.064622 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-75f5m"] Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.074782 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-kt5kv"] Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.181881 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7"] Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.184297 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.187699 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.189181 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.192052 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7"] Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.344890 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/902d9f86-3aec-47c7-ae24-d9d2a310e614-secret-volume\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.344932 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4v6f\" (UniqueName: \"kubernetes.io/projected/902d9f86-3aec-47c7-ae24-d9d2a310e614-kube-api-access-g4v6f\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.344984 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/902d9f86-3aec-47c7-ae24-d9d2a310e614-config-volume\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.447042 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/902d9f86-3aec-47c7-ae24-d9d2a310e614-config-volume\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.447579 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/902d9f86-3aec-47c7-ae24-d9d2a310e614-secret-volume\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.447625 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4v6f\" (UniqueName: 
\"kubernetes.io/projected/902d9f86-3aec-47c7-ae24-d9d2a310e614-kube-api-access-g4v6f\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.448093 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/902d9f86-3aec-47c7-ae24-d9d2a310e614-config-volume\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.459487 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/902d9f86-3aec-47c7-ae24-d9d2a310e614-secret-volume\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.472193 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4v6f\" (UniqueName: \"kubernetes.io/projected/902d9f86-3aec-47c7-ae24-d9d2a310e614-kube-api-access-g4v6f\") pod \"collect-profiles-29325915-wmwd7\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.510684 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.751739 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5" path="/var/lib/kubelet/pods/9bc5f0fd-4780-46b2-bf23-daf4dc9cd3a5/volumes" Oct 04 05:15:00 crc kubenswrapper[4574]: I1004 05:15:00.757026 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3aa519a-57c4-46f2-8467-a4b85930eca7" path="/var/lib/kubelet/pods/a3aa519a-57c4-46f2-8467-a4b85930eca7/volumes" Oct 04 05:15:01 crc kubenswrapper[4574]: I1004 05:15:01.010958 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7"] Oct 04 05:15:01 crc kubenswrapper[4574]: I1004 05:15:01.053363 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" event={"ID":"902d9f86-3aec-47c7-ae24-d9d2a310e614","Type":"ContainerStarted","Data":"318bf3e1c0c805119a4f8d6f7a25440fbea8298ff4e5fd4f027fa90beb19f830"} Oct 04 05:15:02 crc kubenswrapper[4574]: I1004 05:15:02.064754 4574 generic.go:334] "Generic (PLEG): container finished" podID="902d9f86-3aec-47c7-ae24-d9d2a310e614" containerID="040e8cc0f6ef08da601f430adf5591d801e57a000af982b1e1209b996ee9efe4" exitCode=0 Oct 04 05:15:02 crc kubenswrapper[4574]: I1004 05:15:02.064811 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" event={"ID":"902d9f86-3aec-47c7-ae24-d9d2a310e614","Type":"ContainerDied","Data":"040e8cc0f6ef08da601f430adf5591d801e57a000af982b1e1209b996ee9efe4"} Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.401886 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.531087 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4v6f\" (UniqueName: \"kubernetes.io/projected/902d9f86-3aec-47c7-ae24-d9d2a310e614-kube-api-access-g4v6f\") pod \"902d9f86-3aec-47c7-ae24-d9d2a310e614\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.531206 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/902d9f86-3aec-47c7-ae24-d9d2a310e614-config-volume\") pod \"902d9f86-3aec-47c7-ae24-d9d2a310e614\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.531360 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/902d9f86-3aec-47c7-ae24-d9d2a310e614-secret-volume\") pod \"902d9f86-3aec-47c7-ae24-d9d2a310e614\" (UID: \"902d9f86-3aec-47c7-ae24-d9d2a310e614\") " Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.532074 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902d9f86-3aec-47c7-ae24-d9d2a310e614-config-volume" (OuterVolumeSpecName: "config-volume") pod "902d9f86-3aec-47c7-ae24-d9d2a310e614" (UID: "902d9f86-3aec-47c7-ae24-d9d2a310e614"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.537895 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902d9f86-3aec-47c7-ae24-d9d2a310e614-kube-api-access-g4v6f" (OuterVolumeSpecName: "kube-api-access-g4v6f") pod "902d9f86-3aec-47c7-ae24-d9d2a310e614" (UID: "902d9f86-3aec-47c7-ae24-d9d2a310e614"). 
InnerVolumeSpecName "kube-api-access-g4v6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.538421 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902d9f86-3aec-47c7-ae24-d9d2a310e614-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "902d9f86-3aec-47c7-ae24-d9d2a310e614" (UID: "902d9f86-3aec-47c7-ae24-d9d2a310e614"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.634668 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4v6f\" (UniqueName: \"kubernetes.io/projected/902d9f86-3aec-47c7-ae24-d9d2a310e614-kube-api-access-g4v6f\") on node \"crc\" DevicePath \"\"" Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.634718 4574 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/902d9f86-3aec-47c7-ae24-d9d2a310e614-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.634728 4574 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/902d9f86-3aec-47c7-ae24-d9d2a310e614-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 05:15:03 crc kubenswrapper[4574]: I1004 05:15:03.734016 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:15:03 crc kubenswrapper[4574]: E1004 05:15:03.734343 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" 
podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:15:04 crc kubenswrapper[4574]: I1004 05:15:04.086697 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" event={"ID":"902d9f86-3aec-47c7-ae24-d9d2a310e614","Type":"ContainerDied","Data":"318bf3e1c0c805119a4f8d6f7a25440fbea8298ff4e5fd4f027fa90beb19f830"} Oct 04 05:15:04 crc kubenswrapper[4574]: I1004 05:15:04.087117 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="318bf3e1c0c805119a4f8d6f7a25440fbea8298ff4e5fd4f027fa90beb19f830" Oct 04 05:15:04 crc kubenswrapper[4574]: I1004 05:15:04.086828 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325915-wmwd7" Oct 04 05:15:15 crc kubenswrapper[4574]: I1004 05:15:15.733146 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:15:15 crc kubenswrapper[4574]: E1004 05:15:15.733985 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:15:17 crc kubenswrapper[4574]: I1004 05:15:17.045083 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8c9r7"] Oct 04 05:15:17 crc kubenswrapper[4574]: I1004 05:15:17.058118 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8c9r7"] Oct 04 05:15:18 crc kubenswrapper[4574]: I1004 05:15:18.744786 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad26bb6b-4342-4bfc-89b0-bb562b16af11" 
path="/var/lib/kubelet/pods/ad26bb6b-4342-4bfc-89b0-bb562b16af11/volumes" Oct 04 05:15:21 crc kubenswrapper[4574]: I1004 05:15:21.041272 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xnbbl"] Oct 04 05:15:21 crc kubenswrapper[4574]: I1004 05:15:21.052017 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xnbbl"] Oct 04 05:15:22 crc kubenswrapper[4574]: I1004 05:15:22.746422 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59854ff7-fdcf-4a21-9fa6-9ab422be068e" path="/var/lib/kubelet/pods/59854ff7-fdcf-4a21-9fa6-9ab422be068e/volumes" Oct 04 05:15:22 crc kubenswrapper[4574]: I1004 05:15:22.974925 4574 scope.go:117] "RemoveContainer" containerID="427a4532ebf2cdf01c075865843888e50ff79eb65d8671c95d848efd7ede07da" Oct 04 05:15:23 crc kubenswrapper[4574]: I1004 05:15:23.006824 4574 scope.go:117] "RemoveContainer" containerID="462246a6f86b0e742a7f51ccc85daebd966441724a641eb8b101f460d59088ee" Oct 04 05:15:23 crc kubenswrapper[4574]: I1004 05:15:23.063358 4574 scope.go:117] "RemoveContainer" containerID="76b53efba941564fb7f377014e082a0e9e5fd3ce29a21fe59b6f149356f15e1c" Oct 04 05:15:23 crc kubenswrapper[4574]: I1004 05:15:23.108156 4574 scope.go:117] "RemoveContainer" containerID="f7a619fa4e10a5aabdb57f766031e7fb3f2efe761fc03332f7708dc79aacbc5b" Oct 04 05:15:23 crc kubenswrapper[4574]: I1004 05:15:23.171911 4574 scope.go:117] "RemoveContainer" containerID="efdff84fbe8dce694f3912b9dbc26ef881dfefe4a9ad96f15344d202e244ccc2" Oct 04 05:15:23 crc kubenswrapper[4574]: I1004 05:15:23.206733 4574 scope.go:117] "RemoveContainer" containerID="18bd4c80932bd536c1168d16197471fcf990efe13d58a500aafc5596f21a6691" Oct 04 05:15:23 crc kubenswrapper[4574]: I1004 05:15:23.254472 4574 scope.go:117] "RemoveContainer" containerID="255ba195d8c5a3f6521995b6b5d51b8d8c42f900cf7009d5dee3896acc9b68fb" Oct 04 05:15:28 crc kubenswrapper[4574]: I1004 05:15:28.732757 4574 scope.go:117] "RemoveContainer" 
containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:15:28 crc kubenswrapper[4574]: E1004 05:15:28.733456 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:15:41 crc kubenswrapper[4574]: I1004 05:15:41.733774 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:15:41 crc kubenswrapper[4574]: E1004 05:15:41.734756 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:15:46 crc kubenswrapper[4574]: I1004 05:15:46.040891 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pms9r"] Oct 04 05:15:46 crc kubenswrapper[4574]: I1004 05:15:46.049505 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pms9r"] Oct 04 05:15:46 crc kubenswrapper[4574]: I1004 05:15:46.743611 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097cde22-53c8-44ef-90c9-7e7dd7c43609" path="/var/lib/kubelet/pods/097cde22-53c8-44ef-90c9-7e7dd7c43609/volumes" Oct 04 05:15:55 crc kubenswrapper[4574]: I1004 05:15:55.734424 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 
05:15:55 crc kubenswrapper[4574]: E1004 05:15:55.736342 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:16:00 crc kubenswrapper[4574]: I1004 05:16:00.626373 4574 generic.go:334] "Generic (PLEG): container finished" podID="b8508613-3769-4000-9037-bce43bf206bb" containerID="6a8d9933f29c59eef0dc78f96bffcf04176ff127a97bf2dda750cbce619f0655" exitCode=0 Oct 04 05:16:00 crc kubenswrapper[4574]: I1004 05:16:00.626648 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" event={"ID":"b8508613-3769-4000-9037-bce43bf206bb","Type":"ContainerDied","Data":"6a8d9933f29c59eef0dc78f96bffcf04176ff127a97bf2dda750cbce619f0655"} Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.041586 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.147088 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-inventory\") pod \"b8508613-3769-4000-9037-bce43bf206bb\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.147349 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkh2b\" (UniqueName: \"kubernetes.io/projected/b8508613-3769-4000-9037-bce43bf206bb-kube-api-access-xkh2b\") pod \"b8508613-3769-4000-9037-bce43bf206bb\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.147397 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-ssh-key\") pod \"b8508613-3769-4000-9037-bce43bf206bb\" (UID: \"b8508613-3769-4000-9037-bce43bf206bb\") " Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.167502 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8508613-3769-4000-9037-bce43bf206bb-kube-api-access-xkh2b" (OuterVolumeSpecName: "kube-api-access-xkh2b") pod "b8508613-3769-4000-9037-bce43bf206bb" (UID: "b8508613-3769-4000-9037-bce43bf206bb"). InnerVolumeSpecName "kube-api-access-xkh2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.176832 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b8508613-3769-4000-9037-bce43bf206bb" (UID: "b8508613-3769-4000-9037-bce43bf206bb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.180762 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-inventory" (OuterVolumeSpecName: "inventory") pod "b8508613-3769-4000-9037-bce43bf206bb" (UID: "b8508613-3769-4000-9037-bce43bf206bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.251865 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkh2b\" (UniqueName: \"kubernetes.io/projected/b8508613-3769-4000-9037-bce43bf206bb-kube-api-access-xkh2b\") on node \"crc\" DevicePath \"\"" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.251909 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.254700 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8508613-3769-4000-9037-bce43bf206bb-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.651688 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" event={"ID":"b8508613-3769-4000-9037-bce43bf206bb","Type":"ContainerDied","Data":"72b15cfeea6727856ce23f36909adf922dbfd0d28fb6813f47d0190aac7b2177"} Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.652043 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72b15cfeea6727856ce23f36909adf922dbfd0d28fb6813f47d0190aac7b2177" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.651734 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.768473 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x"] Oct 04 05:16:02 crc kubenswrapper[4574]: E1004 05:16:02.773223 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8508613-3769-4000-9037-bce43bf206bb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.773308 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8508613-3769-4000-9037-bce43bf206bb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 04 05:16:02 crc kubenswrapper[4574]: E1004 05:16:02.773358 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902d9f86-3aec-47c7-ae24-d9d2a310e614" containerName="collect-profiles" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.773371 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="902d9f86-3aec-47c7-ae24-d9d2a310e614" containerName="collect-profiles" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.786865 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="902d9f86-3aec-47c7-ae24-d9d2a310e614" containerName="collect-profiles" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.787027 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8508613-3769-4000-9037-bce43bf206bb" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.804068 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x"] Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.804307 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.810363 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.810929 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.811159 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.812299 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.979968 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.980180 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:02 crc kubenswrapper[4574]: I1004 05:16:02.980260 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc2pj\" (UniqueName: 
\"kubernetes.io/projected/75cb1602-ada9-4442-be91-3fa85a464d5a-kube-api-access-qc2pj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.037856 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-khrbr"] Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.047718 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-khrbr"] Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.081632 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.081694 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc2pj\" (UniqueName: \"kubernetes.io/projected/75cb1602-ada9-4442-be91-3fa85a464d5a-kube-api-access-qc2pj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.081795 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.087035 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.087082 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.102421 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc2pj\" (UniqueName: \"kubernetes.io/projected/75cb1602-ada9-4442-be91-3fa85a464d5a-kube-api-access-qc2pj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vj82x\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.132491 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:16:03 crc kubenswrapper[4574]: I1004 05:16:03.659697 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x"] Oct 04 05:16:04 crc kubenswrapper[4574]: I1004 05:16:04.672690 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" event={"ID":"75cb1602-ada9-4442-be91-3fa85a464d5a","Type":"ContainerStarted","Data":"17333f589d298d540a10b3c21c8f07cd04c3ad2e39a5f807a8043ab26d49c03a"} Oct 04 05:16:04 crc kubenswrapper[4574]: I1004 05:16:04.673247 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" event={"ID":"75cb1602-ada9-4442-be91-3fa85a464d5a","Type":"ContainerStarted","Data":"aaf66ab94dd9c8b5722b2b1a18b590506679a212fef29ade90479f6919d41544"} Oct 04 05:16:04 crc kubenswrapper[4574]: I1004 05:16:04.748713 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd3ebd3-498c-4070-9de7-eab9d2866108" path="/var/lib/kubelet/pods/9bd3ebd3-498c-4070-9de7-eab9d2866108/volumes" Oct 04 05:16:08 crc kubenswrapper[4574]: I1004 05:16:08.733920 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:16:08 crc kubenswrapper[4574]: E1004 05:16:08.734503 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:16:09 crc kubenswrapper[4574]: I1004 05:16:09.029089 4574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" podStartSLOduration=6.585836291 podStartE2EDuration="7.029066962s" podCreationTimestamp="2025-10-04 05:16:02 +0000 UTC" firstStartedPulling="2025-10-04 05:16:03.656963729 +0000 UTC m=+1789.511106771" lastFinishedPulling="2025-10-04 05:16:04.10019439 +0000 UTC m=+1789.954337442" observedRunningTime="2025-10-04 05:16:04.691033187 +0000 UTC m=+1790.545176219" watchObservedRunningTime="2025-10-04 05:16:09.029066962 +0000 UTC m=+1794.883210014" Oct 04 05:16:09 crc kubenswrapper[4574]: I1004 05:16:09.034903 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tgzwb"] Oct 04 05:16:09 crc kubenswrapper[4574]: I1004 05:16:09.044042 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tgzwb"] Oct 04 05:16:10 crc kubenswrapper[4574]: I1004 05:16:10.028217 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zrf8p"] Oct 04 05:16:10 crc kubenswrapper[4574]: I1004 05:16:10.036042 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tr5bq"] Oct 04 05:16:10 crc kubenswrapper[4574]: I1004 05:16:10.044645 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tr5bq"] Oct 04 05:16:10 crc kubenswrapper[4574]: I1004 05:16:10.056452 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zrf8p"] Oct 04 05:16:10 crc kubenswrapper[4574]: I1004 05:16:10.746054 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="176077ad-74a8-403d-8917-8288171aa8d4" path="/var/lib/kubelet/pods/176077ad-74a8-403d-8917-8288171aa8d4/volumes" Oct 04 05:16:10 crc kubenswrapper[4574]: I1004 05:16:10.747009 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ee193e-b13b-4da7-8f59-a438c1c787c7" 
path="/var/lib/kubelet/pods/20ee193e-b13b-4da7-8f59-a438c1c787c7/volumes" Oct 04 05:16:10 crc kubenswrapper[4574]: I1004 05:16:10.748812 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5fe7de-58bc-4111-97a0-662294fd048b" path="/var/lib/kubelet/pods/4e5fe7de-58bc-4111-97a0-662294fd048b/volumes" Oct 04 05:16:16 crc kubenswrapper[4574]: I1004 05:16:16.037522 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-86ce-account-create-s4xn9"] Oct 04 05:16:16 crc kubenswrapper[4574]: I1004 05:16:16.044622 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-86ce-account-create-s4xn9"] Oct 04 05:16:16 crc kubenswrapper[4574]: I1004 05:16:16.748572 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0388204-4e2e-4a49-b47a-ef648fba57e8" path="/var/lib/kubelet/pods/a0388204-4e2e-4a49-b47a-ef648fba57e8/volumes" Oct 04 05:16:23 crc kubenswrapper[4574]: I1004 05:16:23.422568 4574 scope.go:117] "RemoveContainer" containerID="f29c105623a99459d5fc0228b756b82534fc6a9c427e0d68f6ba478858287c12" Oct 04 05:16:23 crc kubenswrapper[4574]: I1004 05:16:23.485233 4574 scope.go:117] "RemoveContainer" containerID="e00a043640637a84e7524626f1b3dbf01348164f357bc15c5c2ccde54fb3dac2" Oct 04 05:16:23 crc kubenswrapper[4574]: I1004 05:16:23.529882 4574 scope.go:117] "RemoveContainer" containerID="070d29639d18fbc298dbaa50fd29971e1b6ee2075d540cc014e5f3e1ea5909d0" Oct 04 05:16:23 crc kubenswrapper[4574]: I1004 05:16:23.557779 4574 scope.go:117] "RemoveContainer" containerID="89dcdeb9343cf36855d68c34da4b728dd135daa9de25423e41289ba1d794c24b" Oct 04 05:16:23 crc kubenswrapper[4574]: I1004 05:16:23.607581 4574 scope.go:117] "RemoveContainer" containerID="b672895c479b816bc6bb6c0e871a1c341c7aad69f9ed2cb29f9501975c4cb909" Oct 04 05:16:23 crc kubenswrapper[4574]: I1004 05:16:23.657794 4574 scope.go:117] "RemoveContainer" containerID="890e49b885581aef88f08419103b98fbc7609d4becc49ebf9141c939ed7dc9c8" Oct 04 
05:16:23 crc kubenswrapper[4574]: I1004 05:16:23.733815 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:16:23 crc kubenswrapper[4574]: E1004 05:16:23.734079 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:16:26 crc kubenswrapper[4574]: I1004 05:16:26.040734 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b24d-account-create-4x5bk"] Oct 04 05:16:26 crc kubenswrapper[4574]: I1004 05:16:26.049819 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7129-account-create-v7qwd"] Oct 04 05:16:26 crc kubenswrapper[4574]: I1004 05:16:26.056962 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b24d-account-create-4x5bk"] Oct 04 05:16:26 crc kubenswrapper[4574]: I1004 05:16:26.064774 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7129-account-create-v7qwd"] Oct 04 05:16:26 crc kubenswrapper[4574]: I1004 05:16:26.752070 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270e012c-9f36-48e8-8485-b38005557964" path="/var/lib/kubelet/pods/270e012c-9f36-48e8-8485-b38005557964/volumes" Oct 04 05:16:26 crc kubenswrapper[4574]: I1004 05:16:26.752980 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16c650e-b479-4611-b9cf-2085522591c6" path="/var/lib/kubelet/pods/e16c650e-b479-4611-b9cf-2085522591c6/volumes" Oct 04 05:16:37 crc kubenswrapper[4574]: I1004 05:16:37.733871 4574 scope.go:117] "RemoveContainer" 
containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:16:37 crc kubenswrapper[4574]: E1004 05:16:37.734656 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:16:51 crc kubenswrapper[4574]: I1004 05:16:51.733283 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:16:52 crc kubenswrapper[4574]: I1004 05:16:52.118620 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"75806abbab232a33158786e912aa0c12443a8b2653e813b4860c08647deedd1b"} Oct 04 05:17:15 crc kubenswrapper[4574]: I1004 05:17:15.062343 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rzg59"] Oct 04 05:17:15 crc kubenswrapper[4574]: I1004 05:17:15.078695 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rzg59"] Oct 04 05:17:16 crc kubenswrapper[4574]: I1004 05:17:16.746030 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="232b9769-2677-4ce8-991e-a8b94b2e5de1" path="/var/lib/kubelet/pods/232b9769-2677-4ce8-991e-a8b94b2e5de1/volumes" Oct 04 05:17:20 crc kubenswrapper[4574]: I1004 05:17:20.349757 4574 generic.go:334] "Generic (PLEG): container finished" podID="75cb1602-ada9-4442-be91-3fa85a464d5a" containerID="17333f589d298d540a10b3c21c8f07cd04c3ad2e39a5f807a8043ab26d49c03a" exitCode=0 Oct 04 05:17:20 crc kubenswrapper[4574]: I1004 05:17:20.349871 
4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" event={"ID":"75cb1602-ada9-4442-be91-3fa85a464d5a","Type":"ContainerDied","Data":"17333f589d298d540a10b3c21c8f07cd04c3ad2e39a5f807a8043ab26d49c03a"} Oct 04 05:17:21 crc kubenswrapper[4574]: I1004 05:17:21.735367 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:17:21 crc kubenswrapper[4574]: I1004 05:17:21.907526 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc2pj\" (UniqueName: \"kubernetes.io/projected/75cb1602-ada9-4442-be91-3fa85a464d5a-kube-api-access-qc2pj\") pod \"75cb1602-ada9-4442-be91-3fa85a464d5a\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " Oct 04 05:17:21 crc kubenswrapper[4574]: I1004 05:17:21.907627 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-inventory\") pod \"75cb1602-ada9-4442-be91-3fa85a464d5a\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " Oct 04 05:17:21 crc kubenswrapper[4574]: I1004 05:17:21.907675 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-ssh-key\") pod \"75cb1602-ada9-4442-be91-3fa85a464d5a\" (UID: \"75cb1602-ada9-4442-be91-3fa85a464d5a\") " Oct 04 05:17:21 crc kubenswrapper[4574]: I1004 05:17:21.916509 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cb1602-ada9-4442-be91-3fa85a464d5a-kube-api-access-qc2pj" (OuterVolumeSpecName: "kube-api-access-qc2pj") pod "75cb1602-ada9-4442-be91-3fa85a464d5a" (UID: "75cb1602-ada9-4442-be91-3fa85a464d5a"). InnerVolumeSpecName "kube-api-access-qc2pj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:17:21 crc kubenswrapper[4574]: I1004 05:17:21.935301 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "75cb1602-ada9-4442-be91-3fa85a464d5a" (UID: "75cb1602-ada9-4442-be91-3fa85a464d5a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:17:21 crc kubenswrapper[4574]: I1004 05:17:21.935780 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-inventory" (OuterVolumeSpecName: "inventory") pod "75cb1602-ada9-4442-be91-3fa85a464d5a" (UID: "75cb1602-ada9-4442-be91-3fa85a464d5a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.010549 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc2pj\" (UniqueName: \"kubernetes.io/projected/75cb1602-ada9-4442-be91-3fa85a464d5a-kube-api-access-qc2pj\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.010601 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.010610 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75cb1602-ada9-4442-be91-3fa85a464d5a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.118583 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4rxgx"] Oct 04 05:17:22 crc kubenswrapper[4574]: E1004 05:17:22.119031 4574 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="75cb1602-ada9-4442-be91-3fa85a464d5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.119054 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cb1602-ada9-4442-be91-3fa85a464d5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.119413 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="75cb1602-ada9-4442-be91-3fa85a464d5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.121166 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.127970 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rxgx"] Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.213627 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxt4q\" (UniqueName: \"kubernetes.io/projected/0c9249e5-cace-4400-8e51-bf9441b7c727-kube-api-access-qxt4q\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.214052 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-catalog-content\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.214088 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-utilities\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.317541 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-catalog-content\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.317632 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-utilities\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.317777 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxt4q\" (UniqueName: \"kubernetes.io/projected/0c9249e5-cace-4400-8e51-bf9441b7c727-kube-api-access-qxt4q\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.318211 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-catalog-content\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.318313 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-utilities\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.343111 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxt4q\" (UniqueName: \"kubernetes.io/projected/0c9249e5-cace-4400-8e51-bf9441b7c727-kube-api-access-qxt4q\") pod \"community-operators-4rxgx\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.368784 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" event={"ID":"75cb1602-ada9-4442-be91-3fa85a464d5a","Type":"ContainerDied","Data":"aaf66ab94dd9c8b5722b2b1a18b590506679a212fef29ade90479f6919d41544"} Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.368824 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf66ab94dd9c8b5722b2b1a18b590506679a212fef29ade90479f6919d41544" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.368877 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vj82x" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.457839 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.482074 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd"] Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.483535 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.486903 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.487136 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.487387 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.489511 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.494468 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd"] Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.624455 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.624753 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.624933 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l7nv\" (UniqueName: \"kubernetes.io/projected/b688d23c-d5f8-4fc1-bd58-8e710dae393b-kube-api-access-4l7nv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.726420 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l7nv\" (UniqueName: \"kubernetes.io/projected/b688d23c-d5f8-4fc1-bd58-8e710dae393b-kube-api-access-4l7nv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.726502 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.726556 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.730906 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.731124 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.750103 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l7nv\" (UniqueName: \"kubernetes.io/projected/b688d23c-d5f8-4fc1-bd58-8e710dae393b-kube-api-access-4l7nv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z98sd\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.812985 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:22 crc kubenswrapper[4574]: I1004 05:17:22.926526 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rxgx"] Oct 04 05:17:23 crc kubenswrapper[4574]: I1004 05:17:23.377943 4574 generic.go:334] "Generic (PLEG): container finished" podID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerID="eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb" exitCode=0 Oct 04 05:17:23 crc kubenswrapper[4574]: I1004 05:17:23.378027 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rxgx" event={"ID":"0c9249e5-cace-4400-8e51-bf9441b7c727","Type":"ContainerDied","Data":"eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb"} Oct 04 05:17:23 crc kubenswrapper[4574]: I1004 05:17:23.378187 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rxgx" event={"ID":"0c9249e5-cace-4400-8e51-bf9441b7c727","Type":"ContainerStarted","Data":"ff07a1566cb73ec0c5574eaafb821bccaa08daff23a99c8b26af6c57d8550424"} Oct 04 05:17:23 crc kubenswrapper[4574]: I1004 05:17:23.428611 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd"] Oct 04 05:17:23 crc kubenswrapper[4574]: W1004 05:17:23.436128 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb688d23c_d5f8_4fc1_bd58_8e710dae393b.slice/crio-cea7246ff20174916db01195ee3bb8e0cdb11943154c3a70bb58615d86a5ed73 WatchSource:0}: Error finding container cea7246ff20174916db01195ee3bb8e0cdb11943154c3a70bb58615d86a5ed73: Status 404 returned error can't find the container with id cea7246ff20174916db01195ee3bb8e0cdb11943154c3a70bb58615d86a5ed73 Oct 04 05:17:23 crc kubenswrapper[4574]: I1004 05:17:23.810045 4574 scope.go:117] 
"RemoveContainer" containerID="215010902fd4fbb967a64b74964562eae1dafab00d2d21ac10e6a2f4f4096bcf" Oct 04 05:17:23 crc kubenswrapper[4574]: I1004 05:17:23.841211 4574 scope.go:117] "RemoveContainer" containerID="8320ead7ee0e60c7cc47942a2b77889a400c7dad2bc1c673bd5b8cfd96dc073d" Oct 04 05:17:23 crc kubenswrapper[4574]: I1004 05:17:23.894069 4574 scope.go:117] "RemoveContainer" containerID="6d46930556fab887ac8159f85f1d1236bb3505d414e9dbb9f4a82b03c276d8f5" Oct 04 05:17:24 crc kubenswrapper[4574]: I1004 05:17:24.387389 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" event={"ID":"b688d23c-d5f8-4fc1-bd58-8e710dae393b","Type":"ContainerStarted","Data":"e07113882a3d4f71ed28dc3f5bf727a50aee978265eedc516f1e00b98f2d0480"} Oct 04 05:17:24 crc kubenswrapper[4574]: I1004 05:17:24.387658 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" event={"ID":"b688d23c-d5f8-4fc1-bd58-8e710dae393b","Type":"ContainerStarted","Data":"cea7246ff20174916db01195ee3bb8e0cdb11943154c3a70bb58615d86a5ed73"} Oct 04 05:17:24 crc kubenswrapper[4574]: I1004 05:17:24.389794 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rxgx" event={"ID":"0c9249e5-cace-4400-8e51-bf9441b7c727","Type":"ContainerStarted","Data":"831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4"} Oct 04 05:17:24 crc kubenswrapper[4574]: I1004 05:17:24.416705 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" podStartSLOduration=1.9295197960000001 podStartE2EDuration="2.416685265s" podCreationTimestamp="2025-10-04 05:17:22 +0000 UTC" firstStartedPulling="2025-10-04 05:17:23.438425544 +0000 UTC m=+1869.292568586" lastFinishedPulling="2025-10-04 05:17:23.925591013 +0000 UTC m=+1869.779734055" observedRunningTime="2025-10-04 
05:17:24.411445334 +0000 UTC m=+1870.265588376" watchObservedRunningTime="2025-10-04 05:17:24.416685265 +0000 UTC m=+1870.270828317" Oct 04 05:17:25 crc kubenswrapper[4574]: I1004 05:17:25.409525 4574 generic.go:334] "Generic (PLEG): container finished" podID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerID="831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4" exitCode=0 Oct 04 05:17:25 crc kubenswrapper[4574]: I1004 05:17:25.410013 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rxgx" event={"ID":"0c9249e5-cace-4400-8e51-bf9441b7c727","Type":"ContainerDied","Data":"831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4"} Oct 04 05:17:26 crc kubenswrapper[4574]: I1004 05:17:26.427477 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rxgx" event={"ID":"0c9249e5-cace-4400-8e51-bf9441b7c727","Type":"ContainerStarted","Data":"7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544"} Oct 04 05:17:26 crc kubenswrapper[4574]: I1004 05:17:26.443186 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4rxgx" podStartSLOduration=1.9550672420000001 podStartE2EDuration="4.443154492s" podCreationTimestamp="2025-10-04 05:17:22 +0000 UTC" firstStartedPulling="2025-10-04 05:17:23.379714991 +0000 UTC m=+1869.233858033" lastFinishedPulling="2025-10-04 05:17:25.867802241 +0000 UTC m=+1871.721945283" observedRunningTime="2025-10-04 05:17:26.443034979 +0000 UTC m=+1872.297178021" watchObservedRunningTime="2025-10-04 05:17:26.443154492 +0000 UTC m=+1872.297297534" Oct 04 05:17:30 crc kubenswrapper[4574]: I1004 05:17:30.463326 4574 generic.go:334] "Generic (PLEG): container finished" podID="b688d23c-d5f8-4fc1-bd58-8e710dae393b" containerID="e07113882a3d4f71ed28dc3f5bf727a50aee978265eedc516f1e00b98f2d0480" exitCode=0 Oct 04 05:17:30 crc kubenswrapper[4574]: I1004 05:17:30.463356 4574 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" event={"ID":"b688d23c-d5f8-4fc1-bd58-8e710dae393b","Type":"ContainerDied","Data":"e07113882a3d4f71ed28dc3f5bf727a50aee978265eedc516f1e00b98f2d0480"} Oct 04 05:17:31 crc kubenswrapper[4574]: I1004 05:17:31.857344 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:31 crc kubenswrapper[4574]: I1004 05:17:31.923099 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-ssh-key\") pod \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " Oct 04 05:17:31 crc kubenswrapper[4574]: I1004 05:17:31.923454 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-inventory\") pod \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " Oct 04 05:17:31 crc kubenswrapper[4574]: I1004 05:17:31.923644 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l7nv\" (UniqueName: \"kubernetes.io/projected/b688d23c-d5f8-4fc1-bd58-8e710dae393b-kube-api-access-4l7nv\") pod \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\" (UID: \"b688d23c-d5f8-4fc1-bd58-8e710dae393b\") " Oct 04 05:17:31 crc kubenswrapper[4574]: I1004 05:17:31.929160 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b688d23c-d5f8-4fc1-bd58-8e710dae393b-kube-api-access-4l7nv" (OuterVolumeSpecName: "kube-api-access-4l7nv") pod "b688d23c-d5f8-4fc1-bd58-8e710dae393b" (UID: "b688d23c-d5f8-4fc1-bd58-8e710dae393b"). InnerVolumeSpecName "kube-api-access-4l7nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:17:31 crc kubenswrapper[4574]: I1004 05:17:31.956451 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-inventory" (OuterVolumeSpecName: "inventory") pod "b688d23c-d5f8-4fc1-bd58-8e710dae393b" (UID: "b688d23c-d5f8-4fc1-bd58-8e710dae393b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:17:31 crc kubenswrapper[4574]: I1004 05:17:31.956960 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b688d23c-d5f8-4fc1-bd58-8e710dae393b" (UID: "b688d23c-d5f8-4fc1-bd58-8e710dae393b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.025805 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.025846 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l7nv\" (UniqueName: \"kubernetes.io/projected/b688d23c-d5f8-4fc1-bd58-8e710dae393b-kube-api-access-4l7nv\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.025860 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b688d23c-d5f8-4fc1-bd58-8e710dae393b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.458746 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.458795 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.482177 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" event={"ID":"b688d23c-d5f8-4fc1-bd58-8e710dae393b","Type":"ContainerDied","Data":"cea7246ff20174916db01195ee3bb8e0cdb11943154c3a70bb58615d86a5ed73"} Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.482212 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea7246ff20174916db01195ee3bb8e0cdb11943154c3a70bb58615d86a5ed73" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.482285 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z98sd" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.512799 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.571540 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.573271 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk"] Oct 04 05:17:32 crc kubenswrapper[4574]: E1004 05:17:32.573683 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b688d23c-d5f8-4fc1-bd58-8e710dae393b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.573700 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b688d23c-d5f8-4fc1-bd58-8e710dae393b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.573894 4574 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b688d23c-d5f8-4fc1-bd58-8e710dae393b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.582405 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.590202 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.590936 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.592304 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.593227 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.615159 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk"] Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.743506 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.743574 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.743592 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfmdz\" (UniqueName: \"kubernetes.io/projected/39389db3-7317-49b0-af09-e9459d02c5e7-kube-api-access-hfmdz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.764904 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rxgx"] Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.845769 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.845847 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.845876 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfmdz\" (UniqueName: \"kubernetes.io/projected/39389db3-7317-49b0-af09-e9459d02c5e7-kube-api-access-hfmdz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" 
(UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.851584 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.866671 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfmdz\" (UniqueName: \"kubernetes.io/projected/39389db3-7317-49b0-af09-e9459d02c5e7-kube-api-access-hfmdz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.872780 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvnsk\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:32 crc kubenswrapper[4574]: I1004 05:17:32.902924 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:17:33 crc kubenswrapper[4574]: I1004 05:17:33.426128 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk"] Oct 04 05:17:33 crc kubenswrapper[4574]: I1004 05:17:33.491660 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" event={"ID":"39389db3-7317-49b0-af09-e9459d02c5e7","Type":"ContainerStarted","Data":"ef9f627e772db0bf6809106bb9c08c81d17d76ecd8dc823db7b92fa82cb5e12a"} Oct 04 05:17:34 crc kubenswrapper[4574]: I1004 05:17:34.505115 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" event={"ID":"39389db3-7317-49b0-af09-e9459d02c5e7","Type":"ContainerStarted","Data":"7cb5f889ed8f8e7b2e4f66b837789f7fad0b4223bce93624e5811ded96247143"} Oct 04 05:17:34 crc kubenswrapper[4574]: I1004 05:17:34.505473 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4rxgx" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="registry-server" containerID="cri-o://7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544" gracePeriod=2 Oct 04 05:17:34 crc kubenswrapper[4574]: I1004 05:17:34.529315 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" podStartSLOduration=2.122556141 podStartE2EDuration="2.529293793s" podCreationTimestamp="2025-10-04 05:17:32 +0000 UTC" firstStartedPulling="2025-10-04 05:17:33.430383917 +0000 UTC m=+1879.284526959" lastFinishedPulling="2025-10-04 05:17:33.837121569 +0000 UTC m=+1879.691264611" observedRunningTime="2025-10-04 05:17:34.522906468 +0000 UTC m=+1880.377049500" watchObservedRunningTime="2025-10-04 05:17:34.529293793 +0000 UTC m=+1880.383436835" Oct 04 05:17:34 
crc kubenswrapper[4574]: I1004 05:17:34.932943 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.091595 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-catalog-content\") pod \"0c9249e5-cace-4400-8e51-bf9441b7c727\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.091817 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-utilities\") pod \"0c9249e5-cace-4400-8e51-bf9441b7c727\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.091962 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxt4q\" (UniqueName: \"kubernetes.io/projected/0c9249e5-cace-4400-8e51-bf9441b7c727-kube-api-access-qxt4q\") pod \"0c9249e5-cace-4400-8e51-bf9441b7c727\" (UID: \"0c9249e5-cace-4400-8e51-bf9441b7c727\") " Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.092750 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-utilities" (OuterVolumeSpecName: "utilities") pod "0c9249e5-cace-4400-8e51-bf9441b7c727" (UID: "0c9249e5-cace-4400-8e51-bf9441b7c727"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.105502 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9249e5-cace-4400-8e51-bf9441b7c727-kube-api-access-qxt4q" (OuterVolumeSpecName: "kube-api-access-qxt4q") pod "0c9249e5-cace-4400-8e51-bf9441b7c727" (UID: "0c9249e5-cace-4400-8e51-bf9441b7c727"). InnerVolumeSpecName "kube-api-access-qxt4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.194542 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.194856 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxt4q\" (UniqueName: \"kubernetes.io/projected/0c9249e5-cace-4400-8e51-bf9441b7c727-kube-api-access-qxt4q\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.518819 4574 generic.go:334] "Generic (PLEG): container finished" podID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerID="7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544" exitCode=0 Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.518896 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rxgx" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.518911 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rxgx" event={"ID":"0c9249e5-cace-4400-8e51-bf9441b7c727","Type":"ContainerDied","Data":"7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544"} Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.518969 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rxgx" event={"ID":"0c9249e5-cace-4400-8e51-bf9441b7c727","Type":"ContainerDied","Data":"ff07a1566cb73ec0c5574eaafb821bccaa08daff23a99c8b26af6c57d8550424"} Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.518991 4574 scope.go:117] "RemoveContainer" containerID="7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.552008 4574 scope.go:117] "RemoveContainer" containerID="831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.575332 4574 scope.go:117] "RemoveContainer" containerID="eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.619889 4574 scope.go:117] "RemoveContainer" containerID="7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544" Oct 04 05:17:35 crc kubenswrapper[4574]: E1004 05:17:35.620335 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544\": container with ID starting with 7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544 not found: ID does not exist" containerID="7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.620375 4574 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544"} err="failed to get container status \"7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544\": rpc error: code = NotFound desc = could not find container \"7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544\": container with ID starting with 7f291c87ac6337df2f3e136b7b6f1d512685f2c8f2bda3a749b6b2a3156c0544 not found: ID does not exist" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.620403 4574 scope.go:117] "RemoveContainer" containerID="831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4" Oct 04 05:17:35 crc kubenswrapper[4574]: E1004 05:17:35.620839 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4\": container with ID starting with 831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4 not found: ID does not exist" containerID="831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.620875 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4"} err="failed to get container status \"831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4\": rpc error: code = NotFound desc = could not find container \"831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4\": container with ID starting with 831c0d37ece84902ab9a4c608127a18932e048863870ebf2fd1b682dc273eef4 not found: ID does not exist" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.620888 4574 scope.go:117] "RemoveContainer" containerID="eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb" Oct 04 05:17:35 crc kubenswrapper[4574]: E1004 05:17:35.621324 4574 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb\": container with ID starting with eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb not found: ID does not exist" containerID="eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.621355 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb"} err="failed to get container status \"eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb\": rpc error: code = NotFound desc = could not find container \"eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb\": container with ID starting with eebc737450e20d642022284a84e5c7c95fcc007be100738d2cccfd9e0617abcb not found: ID does not exist" Oct 04 05:17:35 crc kubenswrapper[4574]: I1004 05:17:35.945034 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c9249e5-cace-4400-8e51-bf9441b7c727" (UID: "0c9249e5-cace-4400-8e51-bf9441b7c727"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:17:36 crc kubenswrapper[4574]: I1004 05:17:36.009654 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9249e5-cace-4400-8e51-bf9441b7c727-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:17:36 crc kubenswrapper[4574]: I1004 05:17:36.161615 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rxgx"] Oct 04 05:17:36 crc kubenswrapper[4574]: I1004 05:17:36.174073 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4rxgx"] Oct 04 05:17:36 crc kubenswrapper[4574]: I1004 05:17:36.746196 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" path="/var/lib/kubelet/pods/0c9249e5-cace-4400-8e51-bf9441b7c727/volumes" Oct 04 05:17:46 crc kubenswrapper[4574]: I1004 05:17:46.035889 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dxjss"] Oct 04 05:17:46 crc kubenswrapper[4574]: I1004 05:17:46.068733 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dxjss"] Oct 04 05:17:46 crc kubenswrapper[4574]: I1004 05:17:46.748065 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25" path="/var/lib/kubelet/pods/cdfc405a-d6b5-4e7d-9a3a-4d561a10bf25/volumes" Oct 04 05:17:47 crc kubenswrapper[4574]: I1004 05:17:47.041420 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4h6wk"] Oct 04 05:17:47 crc kubenswrapper[4574]: I1004 05:17:47.050333 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4h6wk"] Oct 04 05:17:48 crc kubenswrapper[4574]: I1004 05:17:48.748008 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9f813f9d-cacd-47ec-9f90-889f59e98949" path="/var/lib/kubelet/pods/9f813f9d-cacd-47ec-9f90-889f59e98949/volumes" Oct 04 05:18:15 crc kubenswrapper[4574]: E1004 05:18:15.399793 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39389db3_7317_49b0_af09_e9459d02c5e7.slice/crio-conmon-7cb5f889ed8f8e7b2e4f66b837789f7fad0b4223bce93624e5811ded96247143.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:18:15 crc kubenswrapper[4574]: I1004 05:18:15.855407 4574 generic.go:334] "Generic (PLEG): container finished" podID="39389db3-7317-49b0-af09-e9459d02c5e7" containerID="7cb5f889ed8f8e7b2e4f66b837789f7fad0b4223bce93624e5811ded96247143" exitCode=0 Oct 04 05:18:15 crc kubenswrapper[4574]: I1004 05:18:15.855470 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" event={"ID":"39389db3-7317-49b0-af09-e9459d02c5e7","Type":"ContainerDied","Data":"7cb5f889ed8f8e7b2e4f66b837789f7fad0b4223bce93624e5811ded96247143"} Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.246604 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.323208 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfmdz\" (UniqueName: \"kubernetes.io/projected/39389db3-7317-49b0-af09-e9459d02c5e7-kube-api-access-hfmdz\") pod \"39389db3-7317-49b0-af09-e9459d02c5e7\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.323459 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-ssh-key\") pod \"39389db3-7317-49b0-af09-e9459d02c5e7\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.323680 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-inventory\") pod \"39389db3-7317-49b0-af09-e9459d02c5e7\" (UID: \"39389db3-7317-49b0-af09-e9459d02c5e7\") " Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.333755 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39389db3-7317-49b0-af09-e9459d02c5e7-kube-api-access-hfmdz" (OuterVolumeSpecName: "kube-api-access-hfmdz") pod "39389db3-7317-49b0-af09-e9459d02c5e7" (UID: "39389db3-7317-49b0-af09-e9459d02c5e7"). InnerVolumeSpecName "kube-api-access-hfmdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.360467 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39389db3-7317-49b0-af09-e9459d02c5e7" (UID: "39389db3-7317-49b0-af09-e9459d02c5e7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.365715 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-inventory" (OuterVolumeSpecName: "inventory") pod "39389db3-7317-49b0-af09-e9459d02c5e7" (UID: "39389db3-7317-49b0-af09-e9459d02c5e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.427354 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfmdz\" (UniqueName: \"kubernetes.io/projected/39389db3-7317-49b0-af09-e9459d02c5e7-kube-api-access-hfmdz\") on node \"crc\" DevicePath \"\"" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.428380 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.428570 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39389db3-7317-49b0-af09-e9459d02c5e7-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.875293 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" event={"ID":"39389db3-7317-49b0-af09-e9459d02c5e7","Type":"ContainerDied","Data":"ef9f627e772db0bf6809106bb9c08c81d17d76ecd8dc823db7b92fa82cb5e12a"} Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.875335 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef9f627e772db0bf6809106bb9c08c81d17d76ecd8dc823db7b92fa82cb5e12a" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.875387 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvnsk" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.969340 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr"] Oct 04 05:18:17 crc kubenswrapper[4574]: E1004 05:18:17.969781 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39389db3-7317-49b0-af09-e9459d02c5e7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.969806 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="39389db3-7317-49b0-af09-e9459d02c5e7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:18:17 crc kubenswrapper[4574]: E1004 05:18:17.969816 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="registry-server" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.969824 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="registry-server" Oct 04 05:18:17 crc kubenswrapper[4574]: E1004 05:18:17.969843 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="extract-utilities" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.969850 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="extract-utilities" Oct 04 05:18:17 crc kubenswrapper[4574]: E1004 05:18:17.969861 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="extract-content" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.969867 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="extract-content" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.970082 
4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9249e5-cace-4400-8e51-bf9441b7c727" containerName="registry-server" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.970109 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="39389db3-7317-49b0-af09-e9459d02c5e7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.971067 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.977732 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.977996 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.979496 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.988141 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:18:17 crc kubenswrapper[4574]: I1004 05:18:17.992888 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr"] Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.044481 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 
05:18:18.045588 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.045884 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6h67\" (UniqueName: \"kubernetes.io/projected/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-kube-api-access-s6h67\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.148744 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.148891 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6h67\" (UniqueName: \"kubernetes.io/projected/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-kube-api-access-s6h67\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.149007 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.157965 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.162174 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.168908 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6h67\" (UniqueName: \"kubernetes.io/projected/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-kube-api-access-s6h67\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-grdlr\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.291614 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.856358 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr"] Oct 04 05:18:18 crc kubenswrapper[4574]: I1004 05:18:18.885568 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" event={"ID":"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad","Type":"ContainerStarted","Data":"c8daaa6c498b4e336a90624c82664ed979b32543880279b989660cfd118b7363"} Oct 04 05:18:19 crc kubenswrapper[4574]: I1004 05:18:19.894829 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" event={"ID":"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad","Type":"ContainerStarted","Data":"8a878ad7c4d7f9280ee948c95b1f223634820fd1e40645850ec333fd9e33f005"} Oct 04 05:18:19 crc kubenswrapper[4574]: I1004 05:18:19.915384 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" podStartSLOduration=2.298636282 podStartE2EDuration="2.915366105s" podCreationTimestamp="2025-10-04 05:18:17 +0000 UTC" firstStartedPulling="2025-10-04 05:18:18.866282759 +0000 UTC m=+1924.720425801" lastFinishedPulling="2025-10-04 05:18:19.483012582 +0000 UTC m=+1925.337155624" observedRunningTime="2025-10-04 05:18:19.913929643 +0000 UTC m=+1925.768072685" watchObservedRunningTime="2025-10-04 05:18:19.915366105 +0000 UTC m=+1925.769509147" Oct 04 05:18:24 crc kubenswrapper[4574]: I1004 05:18:24.067703 4574 scope.go:117] "RemoveContainer" containerID="fde8a19cb416d9c2c9e7e3823e81f0ffe0b062d645c96114e137d08076b30e1f" Oct 04 05:18:24 crc kubenswrapper[4574]: I1004 05:18:24.103465 4574 scope.go:117] "RemoveContainer" containerID="df47e842772177fbb92cc2c8c3479bd7afc8882a380feeba9001cf3c22dbe611" Oct 04 05:18:29 crc 
kubenswrapper[4574]: I1004 05:18:29.036438 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ksgc5"] Oct 04 05:18:29 crc kubenswrapper[4574]: I1004 05:18:29.045063 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ksgc5"] Oct 04 05:18:30 crc kubenswrapper[4574]: I1004 05:18:30.744225 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be85814-868f-4899-b745-70da2af4c50a" path="/var/lib/kubelet/pods/1be85814-868f-4899-b745-70da2af4c50a/volumes" Oct 04 05:19:18 crc kubenswrapper[4574]: I1004 05:19:18.411584 4574 generic.go:334] "Generic (PLEG): container finished" podID="9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad" containerID="8a878ad7c4d7f9280ee948c95b1f223634820fd1e40645850ec333fd9e33f005" exitCode=2 Oct 04 05:19:18 crc kubenswrapper[4574]: I1004 05:19:18.411653 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" event={"ID":"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad","Type":"ContainerDied","Data":"8a878ad7c4d7f9280ee948c95b1f223634820fd1e40645850ec333fd9e33f005"} Oct 04 05:19:19 crc kubenswrapper[4574]: I1004 05:19:19.404880 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:19:19 crc kubenswrapper[4574]: I1004 05:19:19.405315 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:19:19 crc kubenswrapper[4574]: I1004 05:19:19.888855 4574 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.006388 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6h67\" (UniqueName: \"kubernetes.io/projected/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-kube-api-access-s6h67\") pod \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.006577 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-inventory\") pod \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.006699 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-ssh-key\") pod \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\" (UID: \"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad\") " Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.013481 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-kube-api-access-s6h67" (OuterVolumeSpecName: "kube-api-access-s6h67") pod "9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad" (UID: "9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad"). InnerVolumeSpecName "kube-api-access-s6h67". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.036468 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-inventory" (OuterVolumeSpecName: "inventory") pod "9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad" (UID: "9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.038100 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad" (UID: "9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.109413 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.109444 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.109455 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6h67\" (UniqueName: \"kubernetes.io/projected/9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad-kube-api-access-s6h67\") on node \"crc\" DevicePath \"\"" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.433538 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" event={"ID":"9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad","Type":"ContainerDied","Data":"c8daaa6c498b4e336a90624c82664ed979b32543880279b989660cfd118b7363"} Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.433592 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8daaa6c498b4e336a90624c82664ed979b32543880279b989660cfd118b7363" Oct 04 05:19:20 crc kubenswrapper[4574]: I1004 05:19:20.433596 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-grdlr" Oct 04 05:19:24 crc kubenswrapper[4574]: I1004 05:19:24.268498 4574 scope.go:117] "RemoveContainer" containerID="43f3dd5edcae0cf9cc8c87da04851f932be3b730fa7963bb0317412267235176" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.033340 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256"] Oct 04 05:19:28 crc kubenswrapper[4574]: E1004 05:19:28.033995 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.034008 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.034214 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.036743 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.039724 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.039936 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.043162 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.048320 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.050687 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256"] Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.172702 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4pxc\" (UniqueName: \"kubernetes.io/projected/da9c2287-2920-4152-bf57-7eb8effbea81-kube-api-access-d4pxc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.172818 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.172861 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.274159 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4pxc\" (UniqueName: \"kubernetes.io/projected/da9c2287-2920-4152-bf57-7eb8effbea81-kube-api-access-d4pxc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.274286 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.274318 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.286641 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: 
\"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.286658 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.290006 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4pxc\" (UniqueName: \"kubernetes.io/projected/da9c2287-2920-4152-bf57-7eb8effbea81-kube-api-access-d4pxc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-65256\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.357862 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.899530 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256"] Oct 04 05:19:28 crc kubenswrapper[4574]: I1004 05:19:28.908841 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:19:29 crc kubenswrapper[4574]: I1004 05:19:29.518456 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" event={"ID":"da9c2287-2920-4152-bf57-7eb8effbea81","Type":"ContainerStarted","Data":"3a4693d4b46cac6bf8db9a6c4ff44d85642a79620950f8b929e9bd9bec4f6f32"} Oct 04 05:19:30 crc kubenswrapper[4574]: I1004 05:19:30.529526 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" event={"ID":"da9c2287-2920-4152-bf57-7eb8effbea81","Type":"ContainerStarted","Data":"503042f8fdc08d3aa9f7effb063b23c642c0596a8eb392dd28bb48b3997d457a"} Oct 04 05:19:30 crc kubenswrapper[4574]: I1004 05:19:30.548123 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" podStartSLOduration=2.0366947460000002 podStartE2EDuration="2.548099454s" podCreationTimestamp="2025-10-04 05:19:28 +0000 UTC" firstStartedPulling="2025-10-04 05:19:28.908600635 +0000 UTC m=+1994.762743677" lastFinishedPulling="2025-10-04 05:19:29.420005343 +0000 UTC m=+1995.274148385" observedRunningTime="2025-10-04 05:19:30.545975042 +0000 UTC m=+1996.400118084" watchObservedRunningTime="2025-10-04 05:19:30.548099454 +0000 UTC m=+1996.402242506" Oct 04 05:19:49 crc kubenswrapper[4574]: I1004 05:19:49.404920 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:19:49 crc kubenswrapper[4574]: I1004 05:19:49.405513 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:20:17 crc kubenswrapper[4574]: I1004 05:20:17.925011 4574 generic.go:334] "Generic (PLEG): container finished" podID="da9c2287-2920-4152-bf57-7eb8effbea81" containerID="503042f8fdc08d3aa9f7effb063b23c642c0596a8eb392dd28bb48b3997d457a" exitCode=0 Oct 04 05:20:17 crc kubenswrapper[4574]: I1004 05:20:17.925103 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" event={"ID":"da9c2287-2920-4152-bf57-7eb8effbea81","Type":"ContainerDied","Data":"503042f8fdc08d3aa9f7effb063b23c642c0596a8eb392dd28bb48b3997d457a"} Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.375814 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.400987 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4pxc\" (UniqueName: \"kubernetes.io/projected/da9c2287-2920-4152-bf57-7eb8effbea81-kube-api-access-d4pxc\") pod \"da9c2287-2920-4152-bf57-7eb8effbea81\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.401079 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-inventory\") pod \"da9c2287-2920-4152-bf57-7eb8effbea81\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.401209 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-ssh-key\") pod \"da9c2287-2920-4152-bf57-7eb8effbea81\" (UID: \"da9c2287-2920-4152-bf57-7eb8effbea81\") " Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.404499 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.404565 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.404617 4574 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.405437 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75806abbab232a33158786e912aa0c12443a8b2653e813b4860c08647deedd1b"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.405534 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://75806abbab232a33158786e912aa0c12443a8b2653e813b4860c08647deedd1b" gracePeriod=600 Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.414681 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9c2287-2920-4152-bf57-7eb8effbea81-kube-api-access-d4pxc" (OuterVolumeSpecName: "kube-api-access-d4pxc") pod "da9c2287-2920-4152-bf57-7eb8effbea81" (UID: "da9c2287-2920-4152-bf57-7eb8effbea81"). InnerVolumeSpecName "kube-api-access-d4pxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.436354 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da9c2287-2920-4152-bf57-7eb8effbea81" (UID: "da9c2287-2920-4152-bf57-7eb8effbea81"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.449601 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-inventory" (OuterVolumeSpecName: "inventory") pod "da9c2287-2920-4152-bf57-7eb8effbea81" (UID: "da9c2287-2920-4152-bf57-7eb8effbea81"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.503257 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.503300 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4pxc\" (UniqueName: \"kubernetes.io/projected/da9c2287-2920-4152-bf57-7eb8effbea81-kube-api-access-d4pxc\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.503332 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da9c2287-2920-4152-bf57-7eb8effbea81-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.943777 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" event={"ID":"da9c2287-2920-4152-bf57-7eb8effbea81","Type":"ContainerDied","Data":"3a4693d4b46cac6bf8db9a6c4ff44d85642a79620950f8b929e9bd9bec4f6f32"} Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.944102 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a4693d4b46cac6bf8db9a6c4ff44d85642a79620950f8b929e9bd9bec4f6f32" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.943821 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-65256" Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.953423 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="75806abbab232a33158786e912aa0c12443a8b2653e813b4860c08647deedd1b" exitCode=0 Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.953468 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"75806abbab232a33158786e912aa0c12443a8b2653e813b4860c08647deedd1b"} Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.953495 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea"} Oct 04 05:20:19 crc kubenswrapper[4574]: I1004 05:20:19.953512 4574 scope.go:117] "RemoveContainer" containerID="ebded590cdaf45589ef7dbeffb9d22f46108c219b808c7b52100b454492ca6fc" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.044537 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5hpm9"] Oct 04 05:20:20 crc kubenswrapper[4574]: E1004 05:20:20.045028 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c2287-2920-4152-bf57-7eb8effbea81" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.045046 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c2287-2920-4152-bf57-7eb8effbea81" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.045282 4574 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="da9c2287-2920-4152-bf57-7eb8effbea81" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.045940 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.047642 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.047873 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.047992 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.049977 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.062277 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5hpm9"] Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.115097 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7jhf\" (UniqueName: \"kubernetes.io/projected/b51be97d-af6b-432b-a671-040de2d05471-kube-api-access-v7jhf\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.115180 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.115276 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.217169 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7jhf\" (UniqueName: \"kubernetes.io/projected/b51be97d-af6b-432b-a671-040de2d05471-kube-api-access-v7jhf\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.217714 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.217911 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.226076 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.226504 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.236969 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7jhf\" (UniqueName: \"kubernetes.io/projected/b51be97d-af6b-432b-a671-040de2d05471-kube-api-access-v7jhf\") pod \"ssh-known-hosts-edpm-deployment-5hpm9\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.365866 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.936804 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5hpm9"] Oct 04 05:20:20 crc kubenswrapper[4574]: W1004 05:20:20.939355 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb51be97d_af6b_432b_a671_040de2d05471.slice/crio-a5ea892cb4ee05a49033694cb89535927e6bdbab44c673a3b5db78e1a14b7e95 WatchSource:0}: Error finding container a5ea892cb4ee05a49033694cb89535927e6bdbab44c673a3b5db78e1a14b7e95: Status 404 returned error can't find the container with id a5ea892cb4ee05a49033694cb89535927e6bdbab44c673a3b5db78e1a14b7e95 Oct 04 05:20:20 crc kubenswrapper[4574]: I1004 05:20:20.966033 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" event={"ID":"b51be97d-af6b-432b-a671-040de2d05471","Type":"ContainerStarted","Data":"a5ea892cb4ee05a49033694cb89535927e6bdbab44c673a3b5db78e1a14b7e95"} Oct 04 05:20:21 crc kubenswrapper[4574]: I1004 05:20:21.977352 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" event={"ID":"b51be97d-af6b-432b-a671-040de2d05471","Type":"ContainerStarted","Data":"8b452b6e61e25d3afcef287117d77a075281972a5e60748910f80d70363ccf79"} Oct 04 05:20:21 crc kubenswrapper[4574]: I1004 05:20:21.997483 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" podStartSLOduration=1.5529412420000002 podStartE2EDuration="1.997464766s" podCreationTimestamp="2025-10-04 05:20:20 +0000 UTC" firstStartedPulling="2025-10-04 05:20:20.941567263 +0000 UTC m=+2046.795710305" lastFinishedPulling="2025-10-04 05:20:21.386090787 +0000 UTC m=+2047.240233829" observedRunningTime="2025-10-04 05:20:21.99447896 +0000 UTC m=+2047.848622002" 
watchObservedRunningTime="2025-10-04 05:20:21.997464766 +0000 UTC m=+2047.851607808" Oct 04 05:20:28 crc kubenswrapper[4574]: E1004 05:20:28.652642 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb51be97d_af6b_432b_a671_040de2d05471.slice/crio-8b452b6e61e25d3afcef287117d77a075281972a5e60748910f80d70363ccf79.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:20:29 crc kubenswrapper[4574]: I1004 05:20:29.040482 4574 generic.go:334] "Generic (PLEG): container finished" podID="b51be97d-af6b-432b-a671-040de2d05471" containerID="8b452b6e61e25d3afcef287117d77a075281972a5e60748910f80d70363ccf79" exitCode=0 Oct 04 05:20:29 crc kubenswrapper[4574]: I1004 05:20:29.040533 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" event={"ID":"b51be97d-af6b-432b-a671-040de2d05471","Type":"ContainerDied","Data":"8b452b6e61e25d3afcef287117d77a075281972a5e60748910f80d70363ccf79"} Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.499330 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.633771 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-ssh-key-openstack-edpm-ipam\") pod \"b51be97d-af6b-432b-a671-040de2d05471\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.633980 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7jhf\" (UniqueName: \"kubernetes.io/projected/b51be97d-af6b-432b-a671-040de2d05471-kube-api-access-v7jhf\") pod \"b51be97d-af6b-432b-a671-040de2d05471\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.634083 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-inventory-0\") pod \"b51be97d-af6b-432b-a671-040de2d05471\" (UID: \"b51be97d-af6b-432b-a671-040de2d05471\") " Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.641043 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51be97d-af6b-432b-a671-040de2d05471-kube-api-access-v7jhf" (OuterVolumeSpecName: "kube-api-access-v7jhf") pod "b51be97d-af6b-432b-a671-040de2d05471" (UID: "b51be97d-af6b-432b-a671-040de2d05471"). InnerVolumeSpecName "kube-api-access-v7jhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.663335 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b51be97d-af6b-432b-a671-040de2d05471" (UID: "b51be97d-af6b-432b-a671-040de2d05471"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.671574 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b51be97d-af6b-432b-a671-040de2d05471" (UID: "b51be97d-af6b-432b-a671-040de2d05471"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.737321 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7jhf\" (UniqueName: \"kubernetes.io/projected/b51be97d-af6b-432b-a671-040de2d05471-kube-api-access-v7jhf\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.737357 4574 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:30 crc kubenswrapper[4574]: I1004 05:20:30.737370 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51be97d-af6b-432b-a671-040de2d05471-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.060712 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" event={"ID":"b51be97d-af6b-432b-a671-040de2d05471","Type":"ContainerDied","Data":"a5ea892cb4ee05a49033694cb89535927e6bdbab44c673a3b5db78e1a14b7e95"} Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.060756 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ea892cb4ee05a49033694cb89535927e6bdbab44c673a3b5db78e1a14b7e95" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.060810 
4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5hpm9" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.165179 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt"] Oct 04 05:20:31 crc kubenswrapper[4574]: E1004 05:20:31.166287 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51be97d-af6b-432b-a671-040de2d05471" containerName="ssh-known-hosts-edpm-deployment" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.166437 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51be97d-af6b-432b-a671-040de2d05471" containerName="ssh-known-hosts-edpm-deployment" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.166781 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51be97d-af6b-432b-a671-040de2d05471" containerName="ssh-known-hosts-edpm-deployment" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.167714 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.170437 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.170989 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.171212 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.172184 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.177528 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt"] Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.246162 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngwk\" (UniqueName: \"kubernetes.io/projected/0cad2098-82fe-4efb-89a6-a440ad6f73dc-kube-api-access-kngwk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.246224 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.246505 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.348917 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.349459 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kngwk\" (UniqueName: \"kubernetes.io/projected/0cad2098-82fe-4efb-89a6-a440ad6f73dc-kube-api-access-kngwk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.349606 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.360205 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.360227 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.379333 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngwk\" (UniqueName: \"kubernetes.io/projected/0cad2098-82fe-4efb-89a6-a440ad6f73dc-kube-api-access-kngwk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jr6kt\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:31 crc kubenswrapper[4574]: I1004 05:20:31.490762 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:32 crc kubenswrapper[4574]: I1004 05:20:32.013451 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt"] Oct 04 05:20:32 crc kubenswrapper[4574]: I1004 05:20:32.070161 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" event={"ID":"0cad2098-82fe-4efb-89a6-a440ad6f73dc","Type":"ContainerStarted","Data":"ef994b8adec54bbbddcf335a2fb1f91127d3951a6c47d6a81f4fbc7d52bf722d"} Oct 04 05:20:33 crc kubenswrapper[4574]: I1004 05:20:33.080247 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" event={"ID":"0cad2098-82fe-4efb-89a6-a440ad6f73dc","Type":"ContainerStarted","Data":"a980ba8044a85a1058f07d5452c1ca996411638b3fbc9c9c3e3b504176770381"} Oct 04 05:20:33 crc kubenswrapper[4574]: I1004 05:20:33.107170 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" podStartSLOduration=1.639404321 podStartE2EDuration="2.107151057s" podCreationTimestamp="2025-10-04 05:20:31 +0000 UTC" firstStartedPulling="2025-10-04 05:20:32.021128574 +0000 UTC m=+2057.875271616" lastFinishedPulling="2025-10-04 05:20:32.48887532 +0000 UTC m=+2058.343018352" observedRunningTime="2025-10-04 05:20:33.096592342 +0000 UTC m=+2058.950735384" watchObservedRunningTime="2025-10-04 05:20:33.107151057 +0000 UTC m=+2058.961294099" Oct 04 05:20:41 crc kubenswrapper[4574]: I1004 05:20:41.142716 4574 generic.go:334] "Generic (PLEG): container finished" podID="0cad2098-82fe-4efb-89a6-a440ad6f73dc" containerID="a980ba8044a85a1058f07d5452c1ca996411638b3fbc9c9c3e3b504176770381" exitCode=0 Oct 04 05:20:41 crc kubenswrapper[4574]: I1004 05:20:41.142854 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" event={"ID":"0cad2098-82fe-4efb-89a6-a440ad6f73dc","Type":"ContainerDied","Data":"a980ba8044a85a1058f07d5452c1ca996411638b3fbc9c9c3e3b504176770381"} Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.468183 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-67mbj"] Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.474093 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.496842 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67mbj"] Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.562126 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-utilities\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.562219 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh67j\" (UniqueName: \"kubernetes.io/projected/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-kube-api-access-lh67j\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.562383 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-catalog-content\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" 
Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.590730 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.667823 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh67j\" (UniqueName: \"kubernetes.io/projected/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-kube-api-access-lh67j\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.667957 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-catalog-content\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.668017 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-utilities\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.668452 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-catalog-content\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.668491 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-utilities\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.692433 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh67j\" (UniqueName: \"kubernetes.io/projected/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-kube-api-access-lh67j\") pod \"certified-operators-67mbj\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.769111 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-inventory\") pod \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.769214 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kngwk\" (UniqueName: \"kubernetes.io/projected/0cad2098-82fe-4efb-89a6-a440ad6f73dc-kube-api-access-kngwk\") pod \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.769303 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-ssh-key\") pod \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\" (UID: \"0cad2098-82fe-4efb-89a6-a440ad6f73dc\") " Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.773357 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cad2098-82fe-4efb-89a6-a440ad6f73dc-kube-api-access-kngwk" (OuterVolumeSpecName: "kube-api-access-kngwk") pod "0cad2098-82fe-4efb-89a6-a440ad6f73dc" (UID: 
"0cad2098-82fe-4efb-89a6-a440ad6f73dc"). InnerVolumeSpecName "kube-api-access-kngwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.797023 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0cad2098-82fe-4efb-89a6-a440ad6f73dc" (UID: "0cad2098-82fe-4efb-89a6-a440ad6f73dc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.814351 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-inventory" (OuterVolumeSpecName: "inventory") pod "0cad2098-82fe-4efb-89a6-a440ad6f73dc" (UID: "0cad2098-82fe-4efb-89a6-a440ad6f73dc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.871985 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kngwk\" (UniqueName: \"kubernetes.io/projected/0cad2098-82fe-4efb-89a6-a440ad6f73dc-kube-api-access-kngwk\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.872030 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.872043 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cad2098-82fe-4efb-89a6-a440ad6f73dc-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:42 crc kubenswrapper[4574]: I1004 05:20:42.884895 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.166811 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" event={"ID":"0cad2098-82fe-4efb-89a6-a440ad6f73dc","Type":"ContainerDied","Data":"ef994b8adec54bbbddcf335a2fb1f91127d3951a6c47d6a81f4fbc7d52bf722d"} Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.167123 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef994b8adec54bbbddcf335a2fb1f91127d3951a6c47d6a81f4fbc7d52bf722d" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.167199 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jr6kt" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.281431 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6"] Oct 04 05:20:43 crc kubenswrapper[4574]: E1004 05:20:43.281950 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cad2098-82fe-4efb-89a6-a440ad6f73dc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.281963 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cad2098-82fe-4efb-89a6-a440ad6f73dc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.282159 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cad2098-82fe-4efb-89a6-a440ad6f73dc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.282847 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.287743 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.287949 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.288095 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.288673 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.290305 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6"] Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.381455 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5qwv\" (UniqueName: \"kubernetes.io/projected/eba11170-e0cf-4e7a-8e9a-771fde74bff1-kube-api-access-h5qwv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.381575 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.381614 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.468347 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67mbj"] Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.483018 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.483065 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.483158 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qwv\" (UniqueName: \"kubernetes.io/projected/eba11170-e0cf-4e7a-8e9a-771fde74bff1-kube-api-access-h5qwv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.488753 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.488789 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.499803 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5qwv\" (UniqueName: \"kubernetes.io/projected/eba11170-e0cf-4e7a-8e9a-771fde74bff1-kube-api-access-h5qwv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:43 crc kubenswrapper[4574]: I1004 05:20:43.607445 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:44 crc kubenswrapper[4574]: I1004 05:20:44.097681 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6"] Oct 04 05:20:44 crc kubenswrapper[4574]: W1004 05:20:44.100371 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeba11170_e0cf_4e7a_8e9a_771fde74bff1.slice/crio-2dc8bdc4d42797da1a446b290a8dc816514870e0b2b785aee8e755a016a7ec05 WatchSource:0}: Error finding container 2dc8bdc4d42797da1a446b290a8dc816514870e0b2b785aee8e755a016a7ec05: Status 404 returned error can't find the container with id 2dc8bdc4d42797da1a446b290a8dc816514870e0b2b785aee8e755a016a7ec05 Oct 04 05:20:44 crc kubenswrapper[4574]: I1004 05:20:44.177811 4574 generic.go:334] "Generic (PLEG): container finished" podID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerID="2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55" exitCode=0 Oct 04 05:20:44 crc kubenswrapper[4574]: I1004 05:20:44.177918 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mbj" event={"ID":"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c","Type":"ContainerDied","Data":"2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55"} Oct 04 05:20:44 crc kubenswrapper[4574]: I1004 05:20:44.178198 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mbj" event={"ID":"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c","Type":"ContainerStarted","Data":"8fa202755be9841ca653348d2764ab70373f971ac7b6ab383e16097d82e84f30"} Oct 04 05:20:44 crc kubenswrapper[4574]: I1004 05:20:44.179388 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" 
event={"ID":"eba11170-e0cf-4e7a-8e9a-771fde74bff1","Type":"ContainerStarted","Data":"2dc8bdc4d42797da1a446b290a8dc816514870e0b2b785aee8e755a016a7ec05"} Oct 04 05:20:45 crc kubenswrapper[4574]: I1004 05:20:45.190518 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mbj" event={"ID":"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c","Type":"ContainerStarted","Data":"fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d"} Oct 04 05:20:45 crc kubenswrapper[4574]: I1004 05:20:45.197366 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" event={"ID":"eba11170-e0cf-4e7a-8e9a-771fde74bff1","Type":"ContainerStarted","Data":"b0bf23a324ef5516856fbbfd46cf2ed0d95c7abd865e3c1db58f5861ca06522e"} Oct 04 05:20:45 crc kubenswrapper[4574]: I1004 05:20:45.241228 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" podStartSLOduration=1.791123345 podStartE2EDuration="2.24119896s" podCreationTimestamp="2025-10-04 05:20:43 +0000 UTC" firstStartedPulling="2025-10-04 05:20:44.102443501 +0000 UTC m=+2069.956586533" lastFinishedPulling="2025-10-04 05:20:44.552519106 +0000 UTC m=+2070.406662148" observedRunningTime="2025-10-04 05:20:45.226154825 +0000 UTC m=+2071.080297857" watchObservedRunningTime="2025-10-04 05:20:45.24119896 +0000 UTC m=+2071.095342012" Oct 04 05:20:47 crc kubenswrapper[4574]: I1004 05:20:47.214926 4574 generic.go:334] "Generic (PLEG): container finished" podID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerID="fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d" exitCode=0 Oct 04 05:20:47 crc kubenswrapper[4574]: I1004 05:20:47.215132 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mbj" 
event={"ID":"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c","Type":"ContainerDied","Data":"fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d"} Oct 04 05:20:48 crc kubenswrapper[4574]: I1004 05:20:48.227037 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mbj" event={"ID":"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c","Type":"ContainerStarted","Data":"6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0"} Oct 04 05:20:48 crc kubenswrapper[4574]: I1004 05:20:48.253464 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-67mbj" podStartSLOduration=2.7263600820000002 podStartE2EDuration="6.253390161s" podCreationTimestamp="2025-10-04 05:20:42 +0000 UTC" firstStartedPulling="2025-10-04 05:20:44.180421445 +0000 UTC m=+2070.034564487" lastFinishedPulling="2025-10-04 05:20:47.707451524 +0000 UTC m=+2073.561594566" observedRunningTime="2025-10-04 05:20:48.247759448 +0000 UTC m=+2074.101902510" watchObservedRunningTime="2025-10-04 05:20:48.253390161 +0000 UTC m=+2074.107533203" Oct 04 05:20:52 crc kubenswrapper[4574]: I1004 05:20:52.885924 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:52 crc kubenswrapper[4574]: I1004 05:20:52.886496 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:52 crc kubenswrapper[4574]: I1004 05:20:52.933124 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:53 crc kubenswrapper[4574]: I1004 05:20:53.349090 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:53 crc kubenswrapper[4574]: I1004 05:20:53.426671 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-67mbj"] Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.313950 4574 generic.go:334] "Generic (PLEG): container finished" podID="eba11170-e0cf-4e7a-8e9a-771fde74bff1" containerID="b0bf23a324ef5516856fbbfd46cf2ed0d95c7abd865e3c1db58f5861ca06522e" exitCode=0 Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.314411 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-67mbj" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="registry-server" containerID="cri-o://6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0" gracePeriod=2 Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.314043 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" event={"ID":"eba11170-e0cf-4e7a-8e9a-771fde74bff1","Type":"ContainerDied","Data":"b0bf23a324ef5516856fbbfd46cf2ed0d95c7abd865e3c1db58f5861ca06522e"} Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.804365 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.887079 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh67j\" (UniqueName: \"kubernetes.io/projected/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-kube-api-access-lh67j\") pod \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.887532 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-catalog-content\") pod \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.887908 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-utilities\") pod \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\" (UID: \"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c\") " Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.888607 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-utilities" (OuterVolumeSpecName: "utilities") pod "b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" (UID: "b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.895274 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-kube-api-access-lh67j" (OuterVolumeSpecName: "kube-api-access-lh67j") pod "b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" (UID: "b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c"). InnerVolumeSpecName "kube-api-access-lh67j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.943345 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" (UID: "b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.990418 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh67j\" (UniqueName: \"kubernetes.io/projected/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-kube-api-access-lh67j\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.990712 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:55 crc kubenswrapper[4574]: I1004 05:20:55.990821 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.327611 4574 generic.go:334] "Generic (PLEG): container finished" podID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerID="6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0" exitCode=0 Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.327681 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mbj" event={"ID":"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c","Type":"ContainerDied","Data":"6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0"} Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.327768 4574 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-67mbj" event={"ID":"b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c","Type":"ContainerDied","Data":"8fa202755be9841ca653348d2764ab70373f971ac7b6ab383e16097d82e84f30"} Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.327816 4574 scope.go:117] "RemoveContainer" containerID="6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.329478 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67mbj" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.372655 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-67mbj"] Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.384659 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-67mbj"] Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.387776 4574 scope.go:117] "RemoveContainer" containerID="fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.410485 4574 scope.go:117] "RemoveContainer" containerID="2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.480476 4574 scope.go:117] "RemoveContainer" containerID="6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0" Oct 04 05:20:56 crc kubenswrapper[4574]: E1004 05:20:56.481082 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0\": container with ID starting with 6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0 not found: ID does not exist" containerID="6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 
05:20:56.481116 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0"} err="failed to get container status \"6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0\": rpc error: code = NotFound desc = could not find container \"6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0\": container with ID starting with 6a9b031b48e2cf43670e4cf64470805840c5bf1c97a52201b32159a3b4e7e3e0 not found: ID does not exist" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.481140 4574 scope.go:117] "RemoveContainer" containerID="fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d" Oct 04 05:20:56 crc kubenswrapper[4574]: E1004 05:20:56.481641 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d\": container with ID starting with fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d not found: ID does not exist" containerID="fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.481692 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d"} err="failed to get container status \"fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d\": rpc error: code = NotFound desc = could not find container \"fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d\": container with ID starting with fe09b09c59f59f040f7b53611f3ff572fa712c6f8c65e9dde0d8dc08fd730d3d not found: ID does not exist" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.481726 4574 scope.go:117] "RemoveContainer" containerID="2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55" Oct 04 05:20:56 crc 
kubenswrapper[4574]: E1004 05:20:56.482788 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55\": container with ID starting with 2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55 not found: ID does not exist" containerID="2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.482813 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55"} err="failed to get container status \"2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55\": rpc error: code = NotFound desc = could not find container \"2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55\": container with ID starting with 2494c0d309ae1407afea69f2fcedfe561fa3a45d9d0d042adb7eb51a0717ed55 not found: ID does not exist" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.753622 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" path="/var/lib/kubelet/pods/b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c/volumes" Oct 04 05:20:56 crc kubenswrapper[4574]: I1004 05:20:56.859367 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.018271 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-ssh-key\") pod \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.018417 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5qwv\" (UniqueName: \"kubernetes.io/projected/eba11170-e0cf-4e7a-8e9a-771fde74bff1-kube-api-access-h5qwv\") pod \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.019578 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-inventory\") pod \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\" (UID: \"eba11170-e0cf-4e7a-8e9a-771fde74bff1\") " Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.024860 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba11170-e0cf-4e7a-8e9a-771fde74bff1-kube-api-access-h5qwv" (OuterVolumeSpecName: "kube-api-access-h5qwv") pod "eba11170-e0cf-4e7a-8e9a-771fde74bff1" (UID: "eba11170-e0cf-4e7a-8e9a-771fde74bff1"). InnerVolumeSpecName "kube-api-access-h5qwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.048106 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-inventory" (OuterVolumeSpecName: "inventory") pod "eba11170-e0cf-4e7a-8e9a-771fde74bff1" (UID: "eba11170-e0cf-4e7a-8e9a-771fde74bff1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.057081 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eba11170-e0cf-4e7a-8e9a-771fde74bff1" (UID: "eba11170-e0cf-4e7a-8e9a-771fde74bff1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.124253 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.124301 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5qwv\" (UniqueName: \"kubernetes.io/projected/eba11170-e0cf-4e7a-8e9a-771fde74bff1-kube-api-access-h5qwv\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.124318 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eba11170-e0cf-4e7a-8e9a-771fde74bff1-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.343381 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" event={"ID":"eba11170-e0cf-4e7a-8e9a-771fde74bff1","Type":"ContainerDied","Data":"2dc8bdc4d42797da1a446b290a8dc816514870e0b2b785aee8e755a016a7ec05"} Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.343439 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.343478 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dc8bdc4d42797da1a446b290a8dc816514870e0b2b785aee8e755a016a7ec05" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.435065 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s"] Oct 04 05:20:57 crc kubenswrapper[4574]: E1004 05:20:57.435487 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba11170-e0cf-4e7a-8e9a-771fde74bff1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.435504 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba11170-e0cf-4e7a-8e9a-771fde74bff1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:57 crc kubenswrapper[4574]: E1004 05:20:57.435525 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="registry-server" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.435531 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="registry-server" Oct 04 05:20:57 crc kubenswrapper[4574]: E1004 05:20:57.435543 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="extract-utilities" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.435550 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="extract-utilities" Oct 04 05:20:57 crc kubenswrapper[4574]: E1004 05:20:57.435567 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="extract-content" Oct 04 05:20:57 crc kubenswrapper[4574]: 
I1004 05:20:57.435573 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="extract-content" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.435750 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19bc8e0-ffb1-4ef4-9fef-5f28df0af49c" containerName="registry-server" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.435772 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba11170-e0cf-4e7a-8e9a-771fde74bff1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.436372 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.439938 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.440717 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.441401 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.442319 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.442474 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.442791 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.444276 4574 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.444839 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.492082 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s"] Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.531763 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.531825 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.531877 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.532078 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.532321 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.532391 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.532520 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.532823 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.533008 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.533196 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.533312 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.533466 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.533571 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.533703 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s747\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-kube-api-access-9s747\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.635877 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 
05:20:57.637141 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.637280 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.637421 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.637511 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.637595 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.637679 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.637781 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s747\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-kube-api-access-9s747\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.637946 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.638033 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.638155 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.638289 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.638806 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.638923 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.643926 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.644474 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.645318 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.645674 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" 
(UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.647770 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.647895 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.647944 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.648394 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc 
kubenswrapper[4574]: I1004 05:20:57.649393 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.651052 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.653001 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.654077 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.654496 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.662450 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s747\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-kube-api-access-9s747\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:57 crc kubenswrapper[4574]: I1004 05:20:57.752756 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:20:58 crc kubenswrapper[4574]: I1004 05:20:58.361466 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s"] Oct 04 05:20:59 crc kubenswrapper[4574]: I1004 05:20:59.403465 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" event={"ID":"5b869cbb-6227-4391-9faf-2565fc5a4acd","Type":"ContainerStarted","Data":"04e8d8f55f5de3e4d6d42e26b820078dc23e25a83fecdf2f6458fd7bd2c61030"} Oct 04 05:20:59 crc kubenswrapper[4574]: I1004 05:20:59.404080 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" event={"ID":"5b869cbb-6227-4391-9faf-2565fc5a4acd","Type":"ContainerStarted","Data":"862afdd105f2f634d5deeb0f6deccf4f14db000c939d7c17d01466c73acdcf9b"} Oct 04 05:20:59 crc kubenswrapper[4574]: I1004 05:20:59.429027 4574 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" podStartSLOduration=1.976481124 podStartE2EDuration="2.428998269s" podCreationTimestamp="2025-10-04 05:20:57 +0000 UTC" firstStartedPulling="2025-10-04 05:20:58.372453778 +0000 UTC m=+2084.226596810" lastFinishedPulling="2025-10-04 05:20:58.824970913 +0000 UTC m=+2084.679113955" observedRunningTime="2025-10-04 05:20:59.426104226 +0000 UTC m=+2085.280247268" watchObservedRunningTime="2025-10-04 05:20:59.428998269 +0000 UTC m=+2085.283141311" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.139886 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f8q6c"] Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.143937 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.158678 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f8q6c"] Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.291995 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-utilities\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.292512 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.292578 4574 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gq69\" (UniqueName: \"kubernetes.io/projected/1a3e9b96-2773-46a5-99db-81fdce661269-kube-api-access-2gq69\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.394841 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.394947 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gq69\" (UniqueName: \"kubernetes.io/projected/1a3e9b96-2773-46a5-99db-81fdce661269-kube-api-access-2gq69\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.395026 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-utilities\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.395604 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-utilities\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.395808 4574 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.416951 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gq69\" (UniqueName: \"kubernetes.io/projected/1a3e9b96-2773-46a5-99db-81fdce661269-kube-api-access-2gq69\") pod \"redhat-operators-f8q6c\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.466800 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:14 crc kubenswrapper[4574]: I1004 05:21:14.856915 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f8q6c"] Oct 04 05:21:15 crc kubenswrapper[4574]: I1004 05:21:15.558025 4574 generic.go:334] "Generic (PLEG): container finished" podID="1a3e9b96-2773-46a5-99db-81fdce661269" containerID="5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13" exitCode=0 Oct 04 05:21:15 crc kubenswrapper[4574]: I1004 05:21:15.558136 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8q6c" event={"ID":"1a3e9b96-2773-46a5-99db-81fdce661269","Type":"ContainerDied","Data":"5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13"} Oct 04 05:21:15 crc kubenswrapper[4574]: I1004 05:21:15.558365 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8q6c" event={"ID":"1a3e9b96-2773-46a5-99db-81fdce661269","Type":"ContainerStarted","Data":"0595804c8628c6e1a54f3dd69dd2a6fd51d433486085e7f4186be438020d6041"} Oct 04 05:21:16 crc kubenswrapper[4574]: I1004 05:21:16.569685 4574 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8q6c" event={"ID":"1a3e9b96-2773-46a5-99db-81fdce661269","Type":"ContainerStarted","Data":"2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de"} Oct 04 05:21:21 crc kubenswrapper[4574]: I1004 05:21:21.614485 4574 generic.go:334] "Generic (PLEG): container finished" podID="1a3e9b96-2773-46a5-99db-81fdce661269" containerID="2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de" exitCode=0 Oct 04 05:21:21 crc kubenswrapper[4574]: I1004 05:21:21.614675 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8q6c" event={"ID":"1a3e9b96-2773-46a5-99db-81fdce661269","Type":"ContainerDied","Data":"2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de"} Oct 04 05:21:22 crc kubenswrapper[4574]: I1004 05:21:22.627363 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8q6c" event={"ID":"1a3e9b96-2773-46a5-99db-81fdce661269","Type":"ContainerStarted","Data":"8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20"} Oct 04 05:21:22 crc kubenswrapper[4574]: I1004 05:21:22.662531 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f8q6c" podStartSLOduration=2.01648696 podStartE2EDuration="8.662498528s" podCreationTimestamp="2025-10-04 05:21:14 +0000 UTC" firstStartedPulling="2025-10-04 05:21:15.559509776 +0000 UTC m=+2101.413652818" lastFinishedPulling="2025-10-04 05:21:22.205521344 +0000 UTC m=+2108.059664386" observedRunningTime="2025-10-04 05:21:22.64806519 +0000 UTC m=+2108.502208232" watchObservedRunningTime="2025-10-04 05:21:22.662498528 +0000 UTC m=+2108.516641570" Oct 04 05:21:24 crc kubenswrapper[4574]: I1004 05:21:24.467839 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:24 crc kubenswrapper[4574]: I1004 
05:21:24.468140 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:25 crc kubenswrapper[4574]: I1004 05:21:25.520194 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f8q6c" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="registry-server" probeResult="failure" output=< Oct 04 05:21:25 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:21:25 crc kubenswrapper[4574]: > Oct 04 05:21:34 crc kubenswrapper[4574]: I1004 05:21:34.522847 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:34 crc kubenswrapper[4574]: I1004 05:21:34.576688 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:34 crc kubenswrapper[4574]: I1004 05:21:34.757456 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f8q6c"] Oct 04 05:21:35 crc kubenswrapper[4574]: I1004 05:21:35.737527 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f8q6c" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="registry-server" containerID="cri-o://8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20" gracePeriod=2 Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.161206 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.229612 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gq69\" (UniqueName: \"kubernetes.io/projected/1a3e9b96-2773-46a5-99db-81fdce661269-kube-api-access-2gq69\") pod \"1a3e9b96-2773-46a5-99db-81fdce661269\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.229911 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content\") pod \"1a3e9b96-2773-46a5-99db-81fdce661269\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.230026 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-utilities\") pod \"1a3e9b96-2773-46a5-99db-81fdce661269\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.231266 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-utilities" (OuterVolumeSpecName: "utilities") pod "1a3e9b96-2773-46a5-99db-81fdce661269" (UID: "1a3e9b96-2773-46a5-99db-81fdce661269"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.238006 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3e9b96-2773-46a5-99db-81fdce661269-kube-api-access-2gq69" (OuterVolumeSpecName: "kube-api-access-2gq69") pod "1a3e9b96-2773-46a5-99db-81fdce661269" (UID: "1a3e9b96-2773-46a5-99db-81fdce661269"). InnerVolumeSpecName "kube-api-access-2gq69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.331500 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a3e9b96-2773-46a5-99db-81fdce661269" (UID: "1a3e9b96-2773-46a5-99db-81fdce661269"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.332408 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content\") pod \"1a3e9b96-2773-46a5-99db-81fdce661269\" (UID: \"1a3e9b96-2773-46a5-99db-81fdce661269\") " Oct 04 05:21:36 crc kubenswrapper[4574]: W1004 05:21:36.332495 4574 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1a3e9b96-2773-46a5-99db-81fdce661269/volumes/kubernetes.io~empty-dir/catalog-content Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.332619 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a3e9b96-2773-46a5-99db-81fdce661269" (UID: "1a3e9b96-2773-46a5-99db-81fdce661269"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.333128 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.333288 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a3e9b96-2773-46a5-99db-81fdce661269-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.333364 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gq69\" (UniqueName: \"kubernetes.io/projected/1a3e9b96-2773-46a5-99db-81fdce661269-kube-api-access-2gq69\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.751305 4574 generic.go:334] "Generic (PLEG): container finished" podID="1a3e9b96-2773-46a5-99db-81fdce661269" containerID="8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20" exitCode=0 Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.751361 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8q6c" event={"ID":"1a3e9b96-2773-46a5-99db-81fdce661269","Type":"ContainerDied","Data":"8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20"} Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.751390 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8q6c" event={"ID":"1a3e9b96-2773-46a5-99db-81fdce661269","Type":"ContainerDied","Data":"0595804c8628c6e1a54f3dd69dd2a6fd51d433486085e7f4186be438020d6041"} Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.751406 4574 scope.go:117] "RemoveContainer" containerID="8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.751524 
4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f8q6c" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.787024 4574 scope.go:117] "RemoveContainer" containerID="2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.796734 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f8q6c"] Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.813297 4574 scope.go:117] "RemoveContainer" containerID="5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.814061 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f8q6c"] Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.873739 4574 scope.go:117] "RemoveContainer" containerID="8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20" Oct 04 05:21:36 crc kubenswrapper[4574]: E1004 05:21:36.874462 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20\": container with ID starting with 8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20 not found: ID does not exist" containerID="8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.874521 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20"} err="failed to get container status \"8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20\": rpc error: code = NotFound desc = could not find container \"8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20\": container with ID starting with 
8ab7129c2e0c45982f80bf793da7c54b9b330c8497b3c75b87c2a61306dadb20 not found: ID does not exist" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.874560 4574 scope.go:117] "RemoveContainer" containerID="2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de" Oct 04 05:21:36 crc kubenswrapper[4574]: E1004 05:21:36.874932 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de\": container with ID starting with 2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de not found: ID does not exist" containerID="2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.874972 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de"} err="failed to get container status \"2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de\": rpc error: code = NotFound desc = could not find container \"2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de\": container with ID starting with 2aa01ed624d2e476ff6bf3e5cb37753abf5790068cc70d366f9b37cce66792de not found: ID does not exist" Oct 04 05:21:36 crc kubenswrapper[4574]: I1004 05:21:36.875000 4574 scope.go:117] "RemoveContainer" containerID="5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13" Oct 04 05:21:36 crc kubenswrapper[4574]: E1004 05:21:36.875387 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13\": container with ID starting with 5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13 not found: ID does not exist" containerID="5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13" Oct 04 05:21:36 crc 
kubenswrapper[4574]: I1004 05:21:36.875691 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13"} err="failed to get container status \"5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13\": rpc error: code = NotFound desc = could not find container \"5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13\": container with ID starting with 5987b3bfc87236baa4c542cf7fdcff993de192dc3d3cf7b31dc6fbeea8783f13 not found: ID does not exist" Oct 04 05:21:38 crc kubenswrapper[4574]: I1004 05:21:38.743612 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" path="/var/lib/kubelet/pods/1a3e9b96-2773-46a5-99db-81fdce661269/volumes" Oct 04 05:21:52 crc kubenswrapper[4574]: I1004 05:21:52.892175 4574 generic.go:334] "Generic (PLEG): container finished" podID="5b869cbb-6227-4391-9faf-2565fc5a4acd" containerID="04e8d8f55f5de3e4d6d42e26b820078dc23e25a83fecdf2f6458fd7bd2c61030" exitCode=0 Oct 04 05:21:52 crc kubenswrapper[4574]: I1004 05:21:52.892345 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" event={"ID":"5b869cbb-6227-4391-9faf-2565fc5a4acd","Type":"ContainerDied","Data":"04e8d8f55f5de3e4d6d42e26b820078dc23e25a83fecdf2f6458fd7bd2c61030"} Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.383988 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.506886 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.509472 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.510298 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-bootstrap-combined-ca-bundle\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.510499 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s747\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-kube-api-access-9s747\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.510612 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.510706 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-repo-setup-combined-ca-bundle\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.510797 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-telemetry-combined-ca-bundle\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.510907 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ovn-combined-ca-bundle\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.510989 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-neutron-metadata-combined-ca-bundle\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.511380 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-inventory\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 
05:21:54.511538 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ssh-key\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.511630 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.511698 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-nova-combined-ca-bundle\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.511791 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-libvirt-combined-ca-bundle\") pod \"5b869cbb-6227-4391-9faf-2565fc5a4acd\" (UID: \"5b869cbb-6227-4391-9faf-2565fc5a4acd\") " Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.517761 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.518744 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.519081 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.519381 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.521962 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.522579 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.523359 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.523483 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.524491 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-kube-api-access-9s747" (OuterVolumeSpecName: "kube-api-access-9s747") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "kube-api-access-9s747". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.525865 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.533442 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.533603 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.563086 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.571085 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-inventory" (OuterVolumeSpecName: "inventory") pod "5b869cbb-6227-4391-9faf-2565fc5a4acd" (UID: "5b869cbb-6227-4391-9faf-2565fc5a4acd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615749 4574 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615798 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615813 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615824 4574 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615836 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s747\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-kube-api-access-9s747\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615846 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615861 4574 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615872 4574 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615886 4574 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615894 4574 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615904 4574 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615912 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615921 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5b869cbb-6227-4391-9faf-2565fc5a4acd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.615929 4574 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b869cbb-6227-4391-9faf-2565fc5a4acd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.913445 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" event={"ID":"5b869cbb-6227-4391-9faf-2565fc5a4acd","Type":"ContainerDied","Data":"862afdd105f2f634d5deeb0f6deccf4f14db000c939d7c17d01466c73acdcf9b"} Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.913515 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="862afdd105f2f634d5deeb0f6deccf4f14db000c939d7c17d01466c73acdcf9b" Oct 04 05:21:54 crc kubenswrapper[4574]: I1004 05:21:54.913516 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.047547 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t"] Oct 04 05:21:55 crc kubenswrapper[4574]: E1004 05:21:55.048223 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="extract-utilities" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.048329 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="extract-utilities" Oct 04 05:21:55 crc kubenswrapper[4574]: E1004 05:21:55.048409 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="extract-content" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.048462 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="extract-content" Oct 04 05:21:55 crc kubenswrapper[4574]: E1004 05:21:55.048531 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="registry-server" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.048587 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="registry-server" Oct 04 05:21:55 crc kubenswrapper[4574]: E1004 05:21:55.048644 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b869cbb-6227-4391-9faf-2565fc5a4acd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.048708 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b869cbb-6227-4391-9faf-2565fc5a4acd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.049030 
4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b869cbb-6227-4391-9faf-2565fc5a4acd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.049121 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3e9b96-2773-46a5-99db-81fdce661269" containerName="registry-server" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.050785 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.057573 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t"] Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.057796 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.058030 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.058152 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.058427 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.061610 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.126991 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcsfm\" (UniqueName: \"kubernetes.io/projected/b1cddc5d-210f-4762-9f80-1b055ad2239b-kube-api-access-zcsfm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: 
\"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.127038 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.127127 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.127169 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.127222 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.230516 4574 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zcsfm\" (UniqueName: \"kubernetes.io/projected/b1cddc5d-210f-4762-9f80-1b055ad2239b-kube-api-access-zcsfm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.231261 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.231461 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.231551 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.231670 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.232182 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.235994 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.237807 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.238217 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.251709 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcsfm\" (UniqueName: \"kubernetes.io/projected/b1cddc5d-210f-4762-9f80-1b055ad2239b-kube-api-access-zcsfm\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-pph2t\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.371484 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:21:55 crc kubenswrapper[4574]: I1004 05:21:55.932810 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t"] Oct 04 05:21:56 crc kubenswrapper[4574]: I1004 05:21:56.937173 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" event={"ID":"b1cddc5d-210f-4762-9f80-1b055ad2239b","Type":"ContainerStarted","Data":"2e459a3fc04966bf511678b1004c6b39ba29993ae0485ab0b27a0db60087cf6c"} Oct 04 05:21:56 crc kubenswrapper[4574]: I1004 05:21:56.937529 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" event={"ID":"b1cddc5d-210f-4762-9f80-1b055ad2239b","Type":"ContainerStarted","Data":"15f1507cb2ae9272d0decf28610b9e481db58171aff75bdbaac4b0a3690262b4"} Oct 04 05:21:56 crc kubenswrapper[4574]: I1004 05:21:56.964098 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" podStartSLOduration=1.425672825 podStartE2EDuration="1.964078443s" podCreationTimestamp="2025-10-04 05:21:55 +0000 UTC" firstStartedPulling="2025-10-04 05:21:55.944273595 +0000 UTC m=+2141.798416627" lastFinishedPulling="2025-10-04 05:21:56.482679203 +0000 UTC m=+2142.336822245" observedRunningTime="2025-10-04 05:21:56.952315027 +0000 UTC m=+2142.806458069" watchObservedRunningTime="2025-10-04 05:21:56.964078443 +0000 UTC m=+2142.818221485" Oct 04 05:22:01 crc kubenswrapper[4574]: I1004 05:22:01.881105 4574 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-d8l8p"] Oct 04 05:22:01 crc kubenswrapper[4574]: I1004 05:22:01.885492 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:01 crc kubenswrapper[4574]: I1004 05:22:01.909095 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8l8p"] Oct 04 05:22:01 crc kubenswrapper[4574]: I1004 05:22:01.966286 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-catalog-content\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:01 crc kubenswrapper[4574]: I1004 05:22:01.966348 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwpc\" (UniqueName: \"kubernetes.io/projected/316bd8b7-b56e-4319-bda6-233f1fa22ebc-kube-api-access-pvwpc\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:01 crc kubenswrapper[4574]: I1004 05:22:01.966384 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-utilities\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.067608 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-catalog-content\") pod \"redhat-marketplace-d8l8p\" (UID: 
\"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.067661 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwpc\" (UniqueName: \"kubernetes.io/projected/316bd8b7-b56e-4319-bda6-233f1fa22ebc-kube-api-access-pvwpc\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.067697 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-utilities\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.068123 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-utilities\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.068367 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-catalog-content\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.098754 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwpc\" (UniqueName: \"kubernetes.io/projected/316bd8b7-b56e-4319-bda6-233f1fa22ebc-kube-api-access-pvwpc\") pod \"redhat-marketplace-d8l8p\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " 
pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.216264 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.727854 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8l8p"] Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.986408 4574 generic.go:334] "Generic (PLEG): container finished" podID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerID="5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b" exitCode=0 Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.986548 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8l8p" event={"ID":"316bd8b7-b56e-4319-bda6-233f1fa22ebc","Type":"ContainerDied","Data":"5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b"} Oct 04 05:22:02 crc kubenswrapper[4574]: I1004 05:22:02.986781 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8l8p" event={"ID":"316bd8b7-b56e-4319-bda6-233f1fa22ebc","Type":"ContainerStarted","Data":"bfc97511ef2c98943da36dffc3a6937afe2e12f5112425e0584370d0e71d1c52"} Oct 04 05:22:05 crc kubenswrapper[4574]: I1004 05:22:05.004887 4574 generic.go:334] "Generic (PLEG): container finished" podID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerID="33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00" exitCode=0 Oct 04 05:22:05 crc kubenswrapper[4574]: I1004 05:22:05.005016 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8l8p" event={"ID":"316bd8b7-b56e-4319-bda6-233f1fa22ebc","Type":"ContainerDied","Data":"33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00"} Oct 04 05:22:06 crc kubenswrapper[4574]: I1004 05:22:06.016230 4574 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-d8l8p" event={"ID":"316bd8b7-b56e-4319-bda6-233f1fa22ebc","Type":"ContainerStarted","Data":"de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299"} Oct 04 05:22:06 crc kubenswrapper[4574]: I1004 05:22:06.039209 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d8l8p" podStartSLOduration=2.369086822 podStartE2EDuration="5.039194045s" podCreationTimestamp="2025-10-04 05:22:01 +0000 UTC" firstStartedPulling="2025-10-04 05:22:02.987893275 +0000 UTC m=+2148.842036317" lastFinishedPulling="2025-10-04 05:22:05.658000498 +0000 UTC m=+2151.512143540" observedRunningTime="2025-10-04 05:22:06.03338043 +0000 UTC m=+2151.887523472" watchObservedRunningTime="2025-10-04 05:22:06.039194045 +0000 UTC m=+2151.893337077" Oct 04 05:22:12 crc kubenswrapper[4574]: I1004 05:22:12.217372 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:12 crc kubenswrapper[4574]: I1004 05:22:12.218035 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:12 crc kubenswrapper[4574]: I1004 05:22:12.266337 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:13 crc kubenswrapper[4574]: I1004 05:22:13.118360 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:13 crc kubenswrapper[4574]: I1004 05:22:13.173664 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8l8p"] Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.086986 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d8l8p" 
podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="registry-server" containerID="cri-o://de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299" gracePeriod=2 Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.542322 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.729277 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvwpc\" (UniqueName: \"kubernetes.io/projected/316bd8b7-b56e-4319-bda6-233f1fa22ebc-kube-api-access-pvwpc\") pod \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.729450 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-catalog-content\") pod \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.729494 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-utilities\") pod \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\" (UID: \"316bd8b7-b56e-4319-bda6-233f1fa22ebc\") " Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.730501 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-utilities" (OuterVolumeSpecName: "utilities") pod "316bd8b7-b56e-4319-bda6-233f1fa22ebc" (UID: "316bd8b7-b56e-4319-bda6-233f1fa22ebc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.742562 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316bd8b7-b56e-4319-bda6-233f1fa22ebc-kube-api-access-pvwpc" (OuterVolumeSpecName: "kube-api-access-pvwpc") pod "316bd8b7-b56e-4319-bda6-233f1fa22ebc" (UID: "316bd8b7-b56e-4319-bda6-233f1fa22ebc"). InnerVolumeSpecName "kube-api-access-pvwpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.745075 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "316bd8b7-b56e-4319-bda6-233f1fa22ebc" (UID: "316bd8b7-b56e-4319-bda6-233f1fa22ebc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.832832 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvwpc\" (UniqueName: \"kubernetes.io/projected/316bd8b7-b56e-4319-bda6-233f1fa22ebc-kube-api-access-pvwpc\") on node \"crc\" DevicePath \"\"" Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.833263 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:22:15 crc kubenswrapper[4574]: I1004 05:22:15.833344 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/316bd8b7-b56e-4319-bda6-233f1fa22ebc-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.097336 4574 generic.go:334] "Generic (PLEG): container finished" podID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" 
containerID="de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299" exitCode=0 Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.097389 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8l8p" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.097406 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8l8p" event={"ID":"316bd8b7-b56e-4319-bda6-233f1fa22ebc","Type":"ContainerDied","Data":"de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299"} Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.099192 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8l8p" event={"ID":"316bd8b7-b56e-4319-bda6-233f1fa22ebc","Type":"ContainerDied","Data":"bfc97511ef2c98943da36dffc3a6937afe2e12f5112425e0584370d0e71d1c52"} Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.099217 4574 scope.go:117] "RemoveContainer" containerID="de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.134016 4574 scope.go:117] "RemoveContainer" containerID="33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.146607 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8l8p"] Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.161248 4574 scope.go:117] "RemoveContainer" containerID="5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.161685 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8l8p"] Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.197561 4574 scope.go:117] "RemoveContainer" containerID="de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299" Oct 04 
05:22:16 crc kubenswrapper[4574]: E1004 05:22:16.198100 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299\": container with ID starting with de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299 not found: ID does not exist" containerID="de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.198146 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299"} err="failed to get container status \"de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299\": rpc error: code = NotFound desc = could not find container \"de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299\": container with ID starting with de06c10d14eb6ef79c02b213be29251b1df8fe17493891510ea607d52efcf299 not found: ID does not exist" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.198175 4574 scope.go:117] "RemoveContainer" containerID="33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00" Oct 04 05:22:16 crc kubenswrapper[4574]: E1004 05:22:16.198712 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00\": container with ID starting with 33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00 not found: ID does not exist" containerID="33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.198754 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00"} err="failed to get container status 
\"33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00\": rpc error: code = NotFound desc = could not find container \"33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00\": container with ID starting with 33d63733b9651fd5700fa08f9b0f09763b219f993db081f23fb8ff6e0c8b3f00 not found: ID does not exist" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.198782 4574 scope.go:117] "RemoveContainer" containerID="5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b" Oct 04 05:22:16 crc kubenswrapper[4574]: E1004 05:22:16.199051 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b\": container with ID starting with 5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b not found: ID does not exist" containerID="5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.199077 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b"} err="failed to get container status \"5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b\": rpc error: code = NotFound desc = could not find container \"5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b\": container with ID starting with 5d5b89e859cf2a892e29793ac84c4c8b51107fa3e81976d9ba920f8303dbcf3b not found: ID does not exist" Oct 04 05:22:16 crc kubenswrapper[4574]: I1004 05:22:16.745084 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" path="/var/lib/kubelet/pods/316bd8b7-b56e-4319-bda6-233f1fa22ebc/volumes" Oct 04 05:22:19 crc kubenswrapper[4574]: I1004 05:22:19.404592 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:22:19 crc kubenswrapper[4574]: I1004 05:22:19.404997 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:22:49 crc kubenswrapper[4574]: I1004 05:22:49.404777 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:22:49 crc kubenswrapper[4574]: I1004 05:22:49.405504 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:23:03 crc kubenswrapper[4574]: I1004 05:23:03.568712 4574 generic.go:334] "Generic (PLEG): container finished" podID="b1cddc5d-210f-4762-9f80-1b055ad2239b" containerID="2e459a3fc04966bf511678b1004c6b39ba29993ae0485ab0b27a0db60087cf6c" exitCode=0 Oct 04 05:23:03 crc kubenswrapper[4574]: I1004 05:23:03.568725 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" event={"ID":"b1cddc5d-210f-4762-9f80-1b055ad2239b","Type":"ContainerDied","Data":"2e459a3fc04966bf511678b1004c6b39ba29993ae0485ab0b27a0db60087cf6c"} Oct 04 05:23:04 crc kubenswrapper[4574]: I1004 05:23:04.954492 4574 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.081924 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovncontroller-config-0\") pod \"b1cddc5d-210f-4762-9f80-1b055ad2239b\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.082320 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovn-combined-ca-bundle\") pod \"b1cddc5d-210f-4762-9f80-1b055ad2239b\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.082411 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ssh-key\") pod \"b1cddc5d-210f-4762-9f80-1b055ad2239b\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.082562 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcsfm\" (UniqueName: \"kubernetes.io/projected/b1cddc5d-210f-4762-9f80-1b055ad2239b-kube-api-access-zcsfm\") pod \"b1cddc5d-210f-4762-9f80-1b055ad2239b\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.082679 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-inventory\") pod \"b1cddc5d-210f-4762-9f80-1b055ad2239b\" (UID: \"b1cddc5d-210f-4762-9f80-1b055ad2239b\") " Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.088741 4574 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b1cddc5d-210f-4762-9f80-1b055ad2239b" (UID: "b1cddc5d-210f-4762-9f80-1b055ad2239b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.099088 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1cddc5d-210f-4762-9f80-1b055ad2239b-kube-api-access-zcsfm" (OuterVolumeSpecName: "kube-api-access-zcsfm") pod "b1cddc5d-210f-4762-9f80-1b055ad2239b" (UID: "b1cddc5d-210f-4762-9f80-1b055ad2239b"). InnerVolumeSpecName "kube-api-access-zcsfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.113439 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b1cddc5d-210f-4762-9f80-1b055ad2239b" (UID: "b1cddc5d-210f-4762-9f80-1b055ad2239b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.116102 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-inventory" (OuterVolumeSpecName: "inventory") pod "b1cddc5d-210f-4762-9f80-1b055ad2239b" (UID: "b1cddc5d-210f-4762-9f80-1b055ad2239b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.119140 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b1cddc5d-210f-4762-9f80-1b055ad2239b" (UID: "b1cddc5d-210f-4762-9f80-1b055ad2239b"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.187313 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcsfm\" (UniqueName: \"kubernetes.io/projected/b1cddc5d-210f-4762-9f80-1b055ad2239b-kube-api-access-zcsfm\") on node \"crc\" DevicePath \"\"" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.187350 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.187360 4574 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.187369 4574 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.187376 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1cddc5d-210f-4762-9f80-1b055ad2239b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.584687 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" event={"ID":"b1cddc5d-210f-4762-9f80-1b055ad2239b","Type":"ContainerDied","Data":"15f1507cb2ae9272d0decf28610b9e481db58171aff75bdbaac4b0a3690262b4"} Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.584950 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15f1507cb2ae9272d0decf28610b9e481db58171aff75bdbaac4b0a3690262b4" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.584722 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pph2t" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.680694 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt"] Oct 04 05:23:05 crc kubenswrapper[4574]: E1004 05:23:05.681139 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="extract-content" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.681158 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="extract-content" Oct 04 05:23:05 crc kubenswrapper[4574]: E1004 05:23:05.681183 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="extract-utilities" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.681190 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="extract-utilities" Oct 04 05:23:05 crc kubenswrapper[4574]: E1004 05:23:05.681203 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cddc5d-210f-4762-9f80-1b055ad2239b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.681211 4574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b1cddc5d-210f-4762-9f80-1b055ad2239b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 04 05:23:05 crc kubenswrapper[4574]: E1004 05:23:05.681272 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="registry-server" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.681281 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="registry-server" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.681508 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1cddc5d-210f-4762-9f80-1b055ad2239b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.681528 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="316bd8b7-b56e-4319-bda6-233f1fa22ebc" containerName="registry-server" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.682328 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.685324 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.685404 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.685512 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.686409 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.686427 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.686659 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.703540 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt"] Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.803171 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r8b\" (UniqueName: \"kubernetes.io/projected/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-kube-api-access-j7r8b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.803917 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.804284 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.805136 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.805310 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.805638 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.916141 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7r8b\" (UniqueName: \"kubernetes.io/projected/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-kube-api-access-j7r8b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.916274 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.916396 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.916654 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: 
\"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.916732 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.917214 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.923907 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.925846 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 
05:23:05.933184 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.939550 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.941468 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:05 crc kubenswrapper[4574]: I1004 05:23:05.941904 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r8b\" (UniqueName: \"kubernetes.io/projected/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-kube-api-access-j7r8b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:06 crc kubenswrapper[4574]: I1004 05:23:06.008368 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:23:06 crc kubenswrapper[4574]: I1004 05:23:06.512427 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt"] Oct 04 05:23:06 crc kubenswrapper[4574]: I1004 05:23:06.594323 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" event={"ID":"658de4d9-d56d-45fd-b0bc-781bbbb30a5e","Type":"ContainerStarted","Data":"fd66ce7a66f4c4b563401803a279444162a8b9b64511049f8c9303f2ca0ea04e"} Oct 04 05:23:07 crc kubenswrapper[4574]: I1004 05:23:07.604649 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" event={"ID":"658de4d9-d56d-45fd-b0bc-781bbbb30a5e","Type":"ContainerStarted","Data":"fbf05ea0fd2332458d46c5dcbdcf5a1d793c9f7deab65deec7636e0fbeee15dc"} Oct 04 05:23:07 crc kubenswrapper[4574]: I1004 05:23:07.630976 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" podStartSLOduration=1.9754801990000002 podStartE2EDuration="2.6309548s" podCreationTimestamp="2025-10-04 05:23:05 +0000 UTC" firstStartedPulling="2025-10-04 05:23:06.512864805 +0000 UTC m=+2212.367007847" lastFinishedPulling="2025-10-04 05:23:07.168339406 +0000 UTC m=+2213.022482448" observedRunningTime="2025-10-04 05:23:07.622293092 +0000 UTC m=+2213.476436134" watchObservedRunningTime="2025-10-04 05:23:07.6309548 +0000 UTC m=+2213.485097842" Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.405369 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.406507 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.406577 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.407375 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.407441 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" gracePeriod=600 Oct 04 05:23:19 crc kubenswrapper[4574]: E1004 05:23:19.548298 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.704887 4574 
generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" exitCode=0 Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.704949 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea"} Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.705377 4574 scope.go:117] "RemoveContainer" containerID="75806abbab232a33158786e912aa0c12443a8b2653e813b4860c08647deedd1b" Oct 04 05:23:19 crc kubenswrapper[4574]: I1004 05:23:19.706154 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:23:19 crc kubenswrapper[4574]: E1004 05:23:19.706608 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:23:30 crc kubenswrapper[4574]: I1004 05:23:30.734162 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:23:30 crc kubenswrapper[4574]: E1004 05:23:30.735185 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" 
podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:23:43 crc kubenswrapper[4574]: I1004 05:23:43.733281 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:23:43 crc kubenswrapper[4574]: E1004 05:23:43.736021 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:23:56 crc kubenswrapper[4574]: I1004 05:23:56.734491 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:23:56 crc kubenswrapper[4574]: E1004 05:23:56.736811 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:24:00 crc kubenswrapper[4574]: I1004 05:24:00.048925 4574 generic.go:334] "Generic (PLEG): container finished" podID="658de4d9-d56d-45fd-b0bc-781bbbb30a5e" containerID="fbf05ea0fd2332458d46c5dcbdcf5a1d793c9f7deab65deec7636e0fbeee15dc" exitCode=0 Oct 04 05:24:00 crc kubenswrapper[4574]: I1004 05:24:00.048962 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" event={"ID":"658de4d9-d56d-45fd-b0bc-781bbbb30a5e","Type":"ContainerDied","Data":"fbf05ea0fd2332458d46c5dcbdcf5a1d793c9f7deab65deec7636e0fbeee15dc"} Oct 04 
05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.466304 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.628106 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-nova-metadata-neutron-config-0\") pod \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.628155 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-inventory\") pod \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.628191 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-ssh-key\") pod \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.628318 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-metadata-combined-ca-bundle\") pod \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.628382 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7r8b\" (UniqueName: \"kubernetes.io/projected/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-kube-api-access-j7r8b\") pod \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\" (UID: 
\"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.628431 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\" (UID: \"658de4d9-d56d-45fd-b0bc-781bbbb30a5e\") " Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.633351 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-kube-api-access-j7r8b" (OuterVolumeSpecName: "kube-api-access-j7r8b") pod "658de4d9-d56d-45fd-b0bc-781bbbb30a5e" (UID: "658de4d9-d56d-45fd-b0bc-781bbbb30a5e"). InnerVolumeSpecName "kube-api-access-j7r8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.636953 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "658de4d9-d56d-45fd-b0bc-781bbbb30a5e" (UID: "658de4d9-d56d-45fd-b0bc-781bbbb30a5e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.657176 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "658de4d9-d56d-45fd-b0bc-781bbbb30a5e" (UID: "658de4d9-d56d-45fd-b0bc-781bbbb30a5e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.659626 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "658de4d9-d56d-45fd-b0bc-781bbbb30a5e" (UID: "658de4d9-d56d-45fd-b0bc-781bbbb30a5e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.660727 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-inventory" (OuterVolumeSpecName: "inventory") pod "658de4d9-d56d-45fd-b0bc-781bbbb30a5e" (UID: "658de4d9-d56d-45fd-b0bc-781bbbb30a5e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.661055 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "658de4d9-d56d-45fd-b0bc-781bbbb30a5e" (UID: "658de4d9-d56d-45fd-b0bc-781bbbb30a5e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.731433 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7r8b\" (UniqueName: \"kubernetes.io/projected/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-kube-api-access-j7r8b\") on node \"crc\" DevicePath \"\"" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.731464 4574 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.731476 4574 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.731488 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.731498 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:24:01 crc kubenswrapper[4574]: I1004 05:24:01.731506 4574 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658de4d9-d56d-45fd-b0bc-781bbbb30a5e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.084407 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" 
event={"ID":"658de4d9-d56d-45fd-b0bc-781bbbb30a5e","Type":"ContainerDied","Data":"fd66ce7a66f4c4b563401803a279444162a8b9b64511049f8c9303f2ca0ea04e"} Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.085412 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd66ce7a66f4c4b563401803a279444162a8b9b64511049f8c9303f2ca0ea04e" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.084505 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.244268 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs"] Oct 04 05:24:02 crc kubenswrapper[4574]: E1004 05:24:02.244966 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658de4d9-d56d-45fd-b0bc-781bbbb30a5e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.245086 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="658de4d9-d56d-45fd-b0bc-781bbbb30a5e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.245355 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="658de4d9-d56d-45fd-b0bc-781bbbb30a5e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.246161 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.250724 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.251075 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.251168 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.251546 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.251653 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.264855 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs"] Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.341384 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clhqb\" (UniqueName: \"kubernetes.io/projected/7f92a088-639a-4112-910b-bb2a76600bac-kube-api-access-clhqb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.341465 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: 
\"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.341606 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.341936 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.341968 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.443615 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clhqb\" (UniqueName: \"kubernetes.io/projected/7f92a088-639a-4112-910b-bb2a76600bac-kube-api-access-clhqb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.444178 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.444226 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.444427 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.444452 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.451022 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.451031 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.451067 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.459978 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.466803 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clhqb\" (UniqueName: \"kubernetes.io/projected/7f92a088-639a-4112-910b-bb2a76600bac-kube-api-access-clhqb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qhprs\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:02 crc kubenswrapper[4574]: I1004 05:24:02.566661 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:24:03 crc kubenswrapper[4574]: I1004 05:24:03.072186 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs"] Oct 04 05:24:03 crc kubenswrapper[4574]: I1004 05:24:03.099496 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" event={"ID":"7f92a088-639a-4112-910b-bb2a76600bac","Type":"ContainerStarted","Data":"7c8bfc02c7d84d94d549248495e4d4e7d02c3c2af231cfca56ef246721f5efd1"} Oct 04 05:24:04 crc kubenswrapper[4574]: I1004 05:24:04.109082 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" event={"ID":"7f92a088-639a-4112-910b-bb2a76600bac","Type":"ContainerStarted","Data":"1034d952488f97999d28f2933973f2931ef958f72e16f4ab1eb107398cffe115"} Oct 04 05:24:04 crc kubenswrapper[4574]: I1004 05:24:04.135499 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" podStartSLOduration=1.4721768530000001 podStartE2EDuration="2.135472928s" podCreationTimestamp="2025-10-04 05:24:02 +0000 UTC" firstStartedPulling="2025-10-04 05:24:03.077412858 +0000 UTC m=+2268.931555890" lastFinishedPulling="2025-10-04 05:24:03.740708923 +0000 UTC m=+2269.594851965" observedRunningTime="2025-10-04 05:24:04.125747801 +0000 UTC m=+2269.979890843" watchObservedRunningTime="2025-10-04 05:24:04.135472928 +0000 UTC m=+2269.989615970" Oct 04 05:24:11 crc kubenswrapper[4574]: I1004 05:24:11.733044 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:24:11 crc kubenswrapper[4574]: E1004 05:24:11.733927 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:24:26 crc kubenswrapper[4574]: I1004 05:24:26.733282 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:24:26 crc kubenswrapper[4574]: E1004 05:24:26.734042 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:24:37 crc kubenswrapper[4574]: I1004 05:24:37.733637 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:24:37 crc kubenswrapper[4574]: E1004 05:24:37.734462 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:24:49 crc kubenswrapper[4574]: I1004 05:24:49.734171 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:24:49 crc kubenswrapper[4574]: E1004 05:24:49.734992 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:25:01 crc kubenswrapper[4574]: I1004 05:25:01.733928 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:25:01 crc kubenswrapper[4574]: E1004 05:25:01.735932 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:25:15 crc kubenswrapper[4574]: I1004 05:25:15.733889 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:25:15 crc kubenswrapper[4574]: E1004 05:25:15.736528 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:25:30 crc kubenswrapper[4574]: I1004 05:25:30.732712 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:25:30 crc kubenswrapper[4574]: E1004 05:25:30.733508 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:25:43 crc kubenswrapper[4574]: I1004 05:25:43.734032 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:25:43 crc kubenswrapper[4574]: E1004 05:25:43.734933 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:25:57 crc kubenswrapper[4574]: I1004 05:25:57.733489 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:25:57 crc kubenswrapper[4574]: E1004 05:25:57.734429 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:26:08 crc kubenswrapper[4574]: I1004 05:26:08.733985 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:26:08 crc kubenswrapper[4574]: E1004 05:26:08.734921 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:26:22 crc kubenswrapper[4574]: I1004 05:26:22.732975 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:26:22 crc kubenswrapper[4574]: E1004 05:26:22.733687 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:26:36 crc kubenswrapper[4574]: I1004 05:26:36.733798 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:26:36 crc kubenswrapper[4574]: E1004 05:26:36.734721 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:26:47 crc kubenswrapper[4574]: I1004 05:26:47.733883 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:26:47 crc kubenswrapper[4574]: E1004 05:26:47.734917 4574 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:26:58 crc kubenswrapper[4574]: I1004 05:26:58.733931 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:26:58 crc kubenswrapper[4574]: E1004 05:26:58.735016 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:27:09 crc kubenswrapper[4574]: I1004 05:27:09.733527 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:27:09 crc kubenswrapper[4574]: E1004 05:27:09.734370 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:27:20 crc kubenswrapper[4574]: I1004 05:27:20.733764 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:27:20 crc kubenswrapper[4574]: E1004 05:27:20.734531 4574 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:27:31 crc kubenswrapper[4574]: I1004 05:27:31.733664 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:27:31 crc kubenswrapper[4574]: E1004 05:27:31.734477 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:27:43 crc kubenswrapper[4574]: I1004 05:27:43.733442 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:27:43 crc kubenswrapper[4574]: E1004 05:27:43.734465 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:27:54 crc kubenswrapper[4574]: I1004 05:27:54.738561 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:27:54 crc kubenswrapper[4574]: E1004 05:27:54.739476 4574 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:28:05 crc kubenswrapper[4574]: I1004 05:28:05.734771 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:28:05 crc kubenswrapper[4574]: E1004 05:28:05.736109 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:28:20 crc kubenswrapper[4574]: I1004 05:28:20.733491 4574 scope.go:117] "RemoveContainer" containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:28:21 crc kubenswrapper[4574]: I1004 05:28:21.320624 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"9081c4a83fb866d34f5bb46858bafeae567e5c4da6462a0dd84649b8d9cefca1"} Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.618571 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-njffb"] Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.625778 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.636021 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njffb"] Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.747465 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-utilities\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.747527 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-catalog-content\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.747780 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brk7l\" (UniqueName: \"kubernetes.io/projected/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-kube-api-access-brk7l\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.850653 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-utilities\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.850734 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-catalog-content\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.850839 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brk7l\" (UniqueName: \"kubernetes.io/projected/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-kube-api-access-brk7l\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.852407 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-utilities\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.852752 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-catalog-content\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.882467 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brk7l\" (UniqueName: \"kubernetes.io/projected/cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd-kube-api-access-brk7l\") pod \"community-operators-njffb\" (UID: \"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd\") " pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:32 crc kubenswrapper[4574]: I1004 05:28:32.962786 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:33 crc kubenswrapper[4574]: W1004 05:28:33.361998 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc61b4c_90a1_434b_b6ae_a845c4fa0bfd.slice/crio-253fbe81bb19ac2f52792e458b2a3b69066e4deb6cbdb1fe94b886d461841691 WatchSource:0}: Error finding container 253fbe81bb19ac2f52792e458b2a3b69066e4deb6cbdb1fe94b886d461841691: Status 404 returned error can't find the container with id 253fbe81bb19ac2f52792e458b2a3b69066e4deb6cbdb1fe94b886d461841691 Oct 04 05:28:33 crc kubenswrapper[4574]: I1004 05:28:33.371017 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njffb"] Oct 04 05:28:33 crc kubenswrapper[4574]: I1004 05:28:33.426068 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njffb" event={"ID":"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd","Type":"ContainerStarted","Data":"253fbe81bb19ac2f52792e458b2a3b69066e4deb6cbdb1fe94b886d461841691"} Oct 04 05:28:34 crc kubenswrapper[4574]: I1004 05:28:34.434801 4574 generic.go:334] "Generic (PLEG): container finished" podID="cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd" containerID="e5037aaa716be8003f0d7a4d6aac83e3b74b7a47b7103e60f2885d8cb3ceb602" exitCode=0 Oct 04 05:28:34 crc kubenswrapper[4574]: I1004 05:28:34.435117 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njffb" event={"ID":"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd","Type":"ContainerDied","Data":"e5037aaa716be8003f0d7a4d6aac83e3b74b7a47b7103e60f2885d8cb3ceb602"} Oct 04 05:28:34 crc kubenswrapper[4574]: I1004 05:28:34.436810 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:28:39 crc kubenswrapper[4574]: I1004 05:28:39.490484 4574 generic.go:334] "Generic (PLEG): container finished" 
podID="cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd" containerID="dceafaf90cf61b2a6d9afbf4ca5b88b823d2104489f19dd01397f96b22317e23" exitCode=0 Oct 04 05:28:39 crc kubenswrapper[4574]: I1004 05:28:39.491188 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njffb" event={"ID":"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd","Type":"ContainerDied","Data":"dceafaf90cf61b2a6d9afbf4ca5b88b823d2104489f19dd01397f96b22317e23"} Oct 04 05:28:40 crc kubenswrapper[4574]: I1004 05:28:40.502746 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njffb" event={"ID":"cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd","Type":"ContainerStarted","Data":"0f09b118de36ceb41089f5c53e11b86980afa2ec278db8726d429fce7a02a672"} Oct 04 05:28:40 crc kubenswrapper[4574]: I1004 05:28:40.528856 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-njffb" podStartSLOduration=2.757807453 podStartE2EDuration="8.528835688s" podCreationTimestamp="2025-10-04 05:28:32 +0000 UTC" firstStartedPulling="2025-10-04 05:28:34.436624775 +0000 UTC m=+2540.290767817" lastFinishedPulling="2025-10-04 05:28:40.20765302 +0000 UTC m=+2546.061796052" observedRunningTime="2025-10-04 05:28:40.526338146 +0000 UTC m=+2546.380481188" watchObservedRunningTime="2025-10-04 05:28:40.528835688 +0000 UTC m=+2546.382978730" Oct 04 05:28:42 crc kubenswrapper[4574]: I1004 05:28:42.963323 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:42 crc kubenswrapper[4574]: I1004 05:28:42.964899 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:43 crc kubenswrapper[4574]: I1004 05:28:43.016059 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:52 
crc kubenswrapper[4574]: I1004 05:28:52.622358 4574 generic.go:334] "Generic (PLEG): container finished" podID="7f92a088-639a-4112-910b-bb2a76600bac" containerID="1034d952488f97999d28f2933973f2931ef958f72e16f4ab1eb107398cffe115" exitCode=0 Oct 04 05:28:52 crc kubenswrapper[4574]: I1004 05:28:52.622424 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" event={"ID":"7f92a088-639a-4112-910b-bb2a76600bac","Type":"ContainerDied","Data":"1034d952488f97999d28f2933973f2931ef958f72e16f4ab1eb107398cffe115"} Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.017854 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-njffb" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.112912 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njffb"] Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.162244 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bj2qb"] Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.162487 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bj2qb" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerName="registry-server" containerID="cri-o://8299160357dd53ee1cf671632ee0adf17c100fa63f01799f6b532b93d2648fb4" gracePeriod=2 Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.665475 4574 generic.go:334] "Generic (PLEG): container finished" podID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerID="8299160357dd53ee1cf671632ee0adf17c100fa63f01799f6b532b93d2648fb4" exitCode=0 Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.666720 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2qb" 
event={"ID":"75531a00-f8c5-4f9d-b7e6-b576ab9bd903","Type":"ContainerDied","Data":"8299160357dd53ee1cf671632ee0adf17c100fa63f01799f6b532b93d2648fb4"} Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.666830 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2qb" event={"ID":"75531a00-f8c5-4f9d-b7e6-b576ab9bd903","Type":"ContainerDied","Data":"321de385d0fc95c3ea543dc0db562ec86dfe2cac90344725fe27feb8fc665177"} Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.666899 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="321de385d0fc95c3ea543dc0db562ec86dfe2cac90344725fe27feb8fc665177" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.689221 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.775805 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-utilities\") pod \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.776211 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vchqc\" (UniqueName: \"kubernetes.io/projected/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-kube-api-access-vchqc\") pod \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.776560 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-catalog-content\") pod \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\" (UID: \"75531a00-f8c5-4f9d-b7e6-b576ab9bd903\") " Oct 04 05:28:53 crc kubenswrapper[4574]: 
I1004 05:28:53.787737 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-utilities" (OuterVolumeSpecName: "utilities") pod "75531a00-f8c5-4f9d-b7e6-b576ab9bd903" (UID: "75531a00-f8c5-4f9d-b7e6-b576ab9bd903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.814482 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-kube-api-access-vchqc" (OuterVolumeSpecName: "kube-api-access-vchqc") pod "75531a00-f8c5-4f9d-b7e6-b576ab9bd903" (UID: "75531a00-f8c5-4f9d-b7e6-b576ab9bd903"). InnerVolumeSpecName "kube-api-access-vchqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.882545 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.882581 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vchqc\" (UniqueName: \"kubernetes.io/projected/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-kube-api-access-vchqc\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.910517 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75531a00-f8c5-4f9d-b7e6-b576ab9bd903" (UID: "75531a00-f8c5-4f9d-b7e6-b576ab9bd903"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:28:53 crc kubenswrapper[4574]: I1004 05:28:53.984421 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75531a00-f8c5-4f9d-b7e6-b576ab9bd903-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.281699 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.394928 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clhqb\" (UniqueName: \"kubernetes.io/projected/7f92a088-639a-4112-910b-bb2a76600bac-kube-api-access-clhqb\") pod \"7f92a088-639a-4112-910b-bb2a76600bac\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.395109 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-ssh-key\") pod \"7f92a088-639a-4112-910b-bb2a76600bac\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.395139 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-inventory\") pod \"7f92a088-639a-4112-910b-bb2a76600bac\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.395262 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-combined-ca-bundle\") pod \"7f92a088-639a-4112-910b-bb2a76600bac\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " Oct 04 05:28:54 crc kubenswrapper[4574]: 
I1004 05:28:54.395304 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-secret-0\") pod \"7f92a088-639a-4112-910b-bb2a76600bac\" (UID: \"7f92a088-639a-4112-910b-bb2a76600bac\") " Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.426217 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f92a088-639a-4112-910b-bb2a76600bac-kube-api-access-clhqb" (OuterVolumeSpecName: "kube-api-access-clhqb") pod "7f92a088-639a-4112-910b-bb2a76600bac" (UID: "7f92a088-639a-4112-910b-bb2a76600bac"). InnerVolumeSpecName "kube-api-access-clhqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.429740 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7f92a088-639a-4112-910b-bb2a76600bac" (UID: "7f92a088-639a-4112-910b-bb2a76600bac"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.481607 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7f92a088-639a-4112-910b-bb2a76600bac" (UID: "7f92a088-639a-4112-910b-bb2a76600bac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.490436 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-inventory" (OuterVolumeSpecName: "inventory") pod "7f92a088-639a-4112-910b-bb2a76600bac" (UID: "7f92a088-639a-4112-910b-bb2a76600bac"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.492748 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7f92a088-639a-4112-910b-bb2a76600bac" (UID: "7f92a088-639a-4112-910b-bb2a76600bac"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.496934 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.496960 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.496972 4574 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.496984 4574 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f92a088-639a-4112-910b-bb2a76600bac-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.496993 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clhqb\" (UniqueName: \"kubernetes.io/projected/7f92a088-639a-4112-910b-bb2a76600bac-kube-api-access-clhqb\") on node \"crc\" DevicePath \"\"" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.677529 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" event={"ID":"7f92a088-639a-4112-910b-bb2a76600bac","Type":"ContainerDied","Data":"7c8bfc02c7d84d94d549248495e4d4e7d02c3c2af231cfca56ef246721f5efd1"} Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.677584 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qhprs" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.677613 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c8bfc02c7d84d94d549248495e4d4e7d02c3c2af231cfca56ef246721f5efd1" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.677553 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj2qb" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.720631 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bj2qb"] Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.731411 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bj2qb"] Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.749429 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" path="/var/lib/kubelet/pods/75531a00-f8c5-4f9d-b7e6-b576ab9bd903/volumes" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.751384 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm"] Oct 04 05:28:54 crc kubenswrapper[4574]: E1004 05:28:54.751736 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerName="extract-content" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.751755 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" 
containerName="extract-content" Oct 04 05:28:54 crc kubenswrapper[4574]: E1004 05:28:54.751800 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerName="extract-utilities" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.751809 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerName="extract-utilities" Oct 04 05:28:54 crc kubenswrapper[4574]: E1004 05:28:54.751846 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerName="registry-server" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.751854 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerName="registry-server" Oct 04 05:28:54 crc kubenswrapper[4574]: E1004 05:28:54.751879 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f92a088-639a-4112-910b-bb2a76600bac" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.751887 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f92a088-639a-4112-910b-bb2a76600bac" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.752101 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="75531a00-f8c5-4f9d-b7e6-b576ab9bd903" containerName="registry-server" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.752135 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f92a088-639a-4112-910b-bb2a76600bac" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.755938 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.760580 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.761012 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.761203 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.761380 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.761758 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.761942 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.763771 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.774497 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm"] Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.802842 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: 
I1004 05:28:54.802935 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.803089 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.803130 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.803158 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.803266 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.803316 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.803346 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.803428 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czxnx\" (UniqueName: \"kubernetes.io/projected/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-kube-api-access-czxnx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904616 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904656 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904677 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904728 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904753 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904772 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904814 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czxnx\" (UniqueName: \"kubernetes.io/projected/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-kube-api-access-czxnx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904849 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.904865 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.905805 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc 
kubenswrapper[4574]: I1004 05:28:54.908664 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.908868 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.909175 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.909223 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.909590 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: 
\"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.910498 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.910518 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:54 crc kubenswrapper[4574]: I1004 05:28:54.923729 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czxnx\" (UniqueName: \"kubernetes.io/projected/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-kube-api-access-czxnx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-th9xm\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:55 crc kubenswrapper[4574]: I1004 05:28:55.076494 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:28:55 crc kubenswrapper[4574]: I1004 05:28:55.715575 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm"] Oct 04 05:28:56 crc kubenswrapper[4574]: I1004 05:28:56.695441 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" event={"ID":"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811","Type":"ContainerStarted","Data":"51dcf2bf244a9d7d4281f785ca48e8ac2c300f6e1deadf7137a7ead7a991c704"} Oct 04 05:28:56 crc kubenswrapper[4574]: I1004 05:28:56.695809 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" event={"ID":"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811","Type":"ContainerStarted","Data":"07d578beb6c414498b56fc7ed1558cd68c69ea0ef56dd4b5fb8f8546c44dbcc3"} Oct 04 05:28:56 crc kubenswrapper[4574]: I1004 05:28:56.714453 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" podStartSLOduration=2.257401171 podStartE2EDuration="2.714433128s" podCreationTimestamp="2025-10-04 05:28:54 +0000 UTC" firstStartedPulling="2025-10-04 05:28:55.720273617 +0000 UTC m=+2561.574416659" lastFinishedPulling="2025-10-04 05:28:56.177305574 +0000 UTC m=+2562.031448616" observedRunningTime="2025-10-04 05:28:56.710135203 +0000 UTC m=+2562.564278245" watchObservedRunningTime="2025-10-04 05:28:56.714433128 +0000 UTC m=+2562.568576170" Oct 04 05:29:24 crc kubenswrapper[4574]: I1004 05:29:24.557422 4574 scope.go:117] "RemoveContainer" containerID="bbcc2d3cd9f91e0d628d22c1f152a3de9c22b8fe3cc9d5b201855c7f7954f0ed" Oct 04 05:29:24 crc kubenswrapper[4574]: I1004 05:29:24.590859 4574 scope.go:117] "RemoveContainer" containerID="676104c9487a7432b1109bacc391a51eda5a146ada91f892dcdb976530c121d3" Oct 04 05:29:24 crc kubenswrapper[4574]: I1004 
05:29:24.643699 4574 scope.go:117] "RemoveContainer" containerID="8299160357dd53ee1cf671632ee0adf17c100fa63f01799f6b532b93d2648fb4" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.152542 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps"] Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.154675 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.156954 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.157216 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.177012 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps"] Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.217503 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a698070-81db-4266-adbf-5e55304b3d71-secret-volume\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.217637 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm49w\" (UniqueName: \"kubernetes.io/projected/8a698070-81db-4266-adbf-5e55304b3d71-kube-api-access-zm49w\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.217868 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a698070-81db-4266-adbf-5e55304b3d71-config-volume\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.319651 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a698070-81db-4266-adbf-5e55304b3d71-secret-volume\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.319967 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm49w\" (UniqueName: \"kubernetes.io/projected/8a698070-81db-4266-adbf-5e55304b3d71-kube-api-access-zm49w\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.320179 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a698070-81db-4266-adbf-5e55304b3d71-config-volume\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.321207 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8a698070-81db-4266-adbf-5e55304b3d71-config-volume\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.327018 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a698070-81db-4266-adbf-5e55304b3d71-secret-volume\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.338144 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm49w\" (UniqueName: \"kubernetes.io/projected/8a698070-81db-4266-adbf-5e55304b3d71-kube-api-access-zm49w\") pod \"collect-profiles-29325930-zbjps\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.477788 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:00 crc kubenswrapper[4574]: I1004 05:30:00.932604 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps"] Oct 04 05:30:01 crc kubenswrapper[4574]: I1004 05:30:01.260810 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" event={"ID":"8a698070-81db-4266-adbf-5e55304b3d71","Type":"ContainerStarted","Data":"21a8cce2d12c2a2a81c3651d1fd7cbd2d903d0d04f6ba3d492215ca633800906"} Oct 04 05:30:01 crc kubenswrapper[4574]: I1004 05:30:01.261282 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" event={"ID":"8a698070-81db-4266-adbf-5e55304b3d71","Type":"ContainerStarted","Data":"8b67ff6c99889f466851d530c07581a9828ea4f12fe0c618d0d962c0327d2750"} Oct 04 05:30:01 crc kubenswrapper[4574]: I1004 05:30:01.291432 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" podStartSLOduration=1.291410613 podStartE2EDuration="1.291410613s" podCreationTimestamp="2025-10-04 05:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:30:01.288720755 +0000 UTC m=+2627.142863797" watchObservedRunningTime="2025-10-04 05:30:01.291410613 +0000 UTC m=+2627.145553655" Oct 04 05:30:02 crc kubenswrapper[4574]: I1004 05:30:02.278000 4574 generic.go:334] "Generic (PLEG): container finished" podID="8a698070-81db-4266-adbf-5e55304b3d71" containerID="21a8cce2d12c2a2a81c3651d1fd7cbd2d903d0d04f6ba3d492215ca633800906" exitCode=0 Oct 04 05:30:02 crc kubenswrapper[4574]: I1004 05:30:02.278323 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" event={"ID":"8a698070-81db-4266-adbf-5e55304b3d71","Type":"ContainerDied","Data":"21a8cce2d12c2a2a81c3651d1fd7cbd2d903d0d04f6ba3d492215ca633800906"} Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.739993 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.795925 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm49w\" (UniqueName: \"kubernetes.io/projected/8a698070-81db-4266-adbf-5e55304b3d71-kube-api-access-zm49w\") pod \"8a698070-81db-4266-adbf-5e55304b3d71\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.796174 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a698070-81db-4266-adbf-5e55304b3d71-secret-volume\") pod \"8a698070-81db-4266-adbf-5e55304b3d71\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.796282 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a698070-81db-4266-adbf-5e55304b3d71-config-volume\") pod \"8a698070-81db-4266-adbf-5e55304b3d71\" (UID: \"8a698070-81db-4266-adbf-5e55304b3d71\") " Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.804878 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a698070-81db-4266-adbf-5e55304b3d71-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a698070-81db-4266-adbf-5e55304b3d71" (UID: "8a698070-81db-4266-adbf-5e55304b3d71"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.822870 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a698070-81db-4266-adbf-5e55304b3d71-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a698070-81db-4266-adbf-5e55304b3d71" (UID: "8a698070-81db-4266-adbf-5e55304b3d71"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.822985 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a698070-81db-4266-adbf-5e55304b3d71-kube-api-access-zm49w" (OuterVolumeSpecName: "kube-api-access-zm49w") pod "8a698070-81db-4266-adbf-5e55304b3d71" (UID: "8a698070-81db-4266-adbf-5e55304b3d71"). InnerVolumeSpecName "kube-api-access-zm49w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.899319 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm49w\" (UniqueName: \"kubernetes.io/projected/8a698070-81db-4266-adbf-5e55304b3d71-kube-api-access-zm49w\") on node \"crc\" DevicePath \"\"" Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.899360 4574 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a698070-81db-4266-adbf-5e55304b3d71-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 05:30:03 crc kubenswrapper[4574]: I1004 05:30:03.899370 4574 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a698070-81db-4266-adbf-5e55304b3d71-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 05:30:04 crc kubenswrapper[4574]: I1004 05:30:04.295753 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" 
event={"ID":"8a698070-81db-4266-adbf-5e55304b3d71","Type":"ContainerDied","Data":"8b67ff6c99889f466851d530c07581a9828ea4f12fe0c618d0d962c0327d2750"} Oct 04 05:30:04 crc kubenswrapper[4574]: I1004 05:30:04.295790 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325930-zbjps" Oct 04 05:30:04 crc kubenswrapper[4574]: I1004 05:30:04.295803 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b67ff6c99889f466851d530c07581a9828ea4f12fe0c618d0d962c0327d2750" Oct 04 05:30:04 crc kubenswrapper[4574]: I1004 05:30:04.357338 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq"] Oct 04 05:30:04 crc kubenswrapper[4574]: I1004 05:30:04.365999 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325885-zs6xq"] Oct 04 05:30:04 crc kubenswrapper[4574]: I1004 05:30:04.746475 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31ed34c-4127-4040-91fb-c53b671f9ab5" path="/var/lib/kubelet/pods/e31ed34c-4127-4040-91fb-c53b671f9ab5/volumes" Oct 04 05:30:24 crc kubenswrapper[4574]: I1004 05:30:24.721474 4574 scope.go:117] "RemoveContainer" containerID="c2eee3bf7123889097aa522b92786fdf20f9ade94ad1d357194b2e4803971b59" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.444860 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8kds9"] Oct 04 05:30:43 crc kubenswrapper[4574]: E1004 05:30:43.446065 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a698070-81db-4266-adbf-5e55304b3d71" containerName="collect-profiles" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.446082 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a698070-81db-4266-adbf-5e55304b3d71" containerName="collect-profiles" Oct 04 05:30:43 crc 
kubenswrapper[4574]: I1004 05:30:43.446317 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a698070-81db-4266-adbf-5e55304b3d71" containerName="collect-profiles" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.448006 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.460262 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8kds9"] Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.589749 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-utilities\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.589838 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-catalog-content\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.589879 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qhsg\" (UniqueName: \"kubernetes.io/projected/5c33b211-6feb-4cb5-ba6a-42d683e790ca-kube-api-access-7qhsg\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.691649 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-utilities\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.691702 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-catalog-content\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.691737 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qhsg\" (UniqueName: \"kubernetes.io/projected/5c33b211-6feb-4cb5-ba6a-42d683e790ca-kube-api-access-7qhsg\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.692357 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-utilities\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.692610 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-catalog-content\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.715449 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qhsg\" (UniqueName: 
\"kubernetes.io/projected/5c33b211-6feb-4cb5-ba6a-42d683e790ca-kube-api-access-7qhsg\") pod \"certified-operators-8kds9\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:43 crc kubenswrapper[4574]: I1004 05:30:43.771216 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:44 crc kubenswrapper[4574]: I1004 05:30:44.278940 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8kds9"] Oct 04 05:30:44 crc kubenswrapper[4574]: I1004 05:30:44.681720 4574 generic.go:334] "Generic (PLEG): container finished" podID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerID="2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c" exitCode=0 Oct 04 05:30:44 crc kubenswrapper[4574]: I1004 05:30:44.681787 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kds9" event={"ID":"5c33b211-6feb-4cb5-ba6a-42d683e790ca","Type":"ContainerDied","Data":"2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c"} Oct 04 05:30:44 crc kubenswrapper[4574]: I1004 05:30:44.681823 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kds9" event={"ID":"5c33b211-6feb-4cb5-ba6a-42d683e790ca","Type":"ContainerStarted","Data":"5d98bf528d614804d7ee82602d5ba4542e256f5615d6214748f318700b19165d"} Oct 04 05:30:45 crc kubenswrapper[4574]: I1004 05:30:45.691896 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kds9" event={"ID":"5c33b211-6feb-4cb5-ba6a-42d683e790ca","Type":"ContainerStarted","Data":"5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea"} Oct 04 05:30:48 crc kubenswrapper[4574]: I1004 05:30:48.721917 4574 generic.go:334] "Generic (PLEG): container finished" podID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" 
containerID="5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea" exitCode=0 Oct 04 05:30:48 crc kubenswrapper[4574]: I1004 05:30:48.721997 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kds9" event={"ID":"5c33b211-6feb-4cb5-ba6a-42d683e790ca","Type":"ContainerDied","Data":"5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea"} Oct 04 05:30:49 crc kubenswrapper[4574]: I1004 05:30:49.404608 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:30:49 crc kubenswrapper[4574]: I1004 05:30:49.404725 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:30:49 crc kubenswrapper[4574]: I1004 05:30:49.735365 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kds9" event={"ID":"5c33b211-6feb-4cb5-ba6a-42d683e790ca","Type":"ContainerStarted","Data":"3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc"} Oct 04 05:30:49 crc kubenswrapper[4574]: I1004 05:30:49.765211 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8kds9" podStartSLOduration=2.080498703 podStartE2EDuration="6.765188282s" podCreationTimestamp="2025-10-04 05:30:43 +0000 UTC" firstStartedPulling="2025-10-04 05:30:44.683455081 +0000 UTC m=+2670.537598123" lastFinishedPulling="2025-10-04 05:30:49.36814466 +0000 UTC m=+2675.222287702" observedRunningTime="2025-10-04 
05:30:49.754998818 +0000 UTC m=+2675.609141860" watchObservedRunningTime="2025-10-04 05:30:49.765188282 +0000 UTC m=+2675.619331324" Oct 04 05:30:53 crc kubenswrapper[4574]: I1004 05:30:53.771606 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:53 crc kubenswrapper[4574]: I1004 05:30:53.772380 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:30:54 crc kubenswrapper[4574]: I1004 05:30:54.822635 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8kds9" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="registry-server" probeResult="failure" output=< Oct 04 05:30:54 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:30:54 crc kubenswrapper[4574]: > Oct 04 05:31:03 crc kubenswrapper[4574]: I1004 05:31:03.820904 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:31:03 crc kubenswrapper[4574]: I1004 05:31:03.870432 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:31:04 crc kubenswrapper[4574]: I1004 05:31:04.057258 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8kds9"] Oct 04 05:31:04 crc kubenswrapper[4574]: I1004 05:31:04.866969 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8kds9" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="registry-server" containerID="cri-o://3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc" gracePeriod=2 Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.317138 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.445998 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-catalog-content\") pod \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.446197 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-utilities\") pod \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.446389 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qhsg\" (UniqueName: \"kubernetes.io/projected/5c33b211-6feb-4cb5-ba6a-42d683e790ca-kube-api-access-7qhsg\") pod \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\" (UID: \"5c33b211-6feb-4cb5-ba6a-42d683e790ca\") " Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.446839 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-utilities" (OuterVolumeSpecName: "utilities") pod "5c33b211-6feb-4cb5-ba6a-42d683e790ca" (UID: "5c33b211-6feb-4cb5-ba6a-42d683e790ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.453458 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c33b211-6feb-4cb5-ba6a-42d683e790ca-kube-api-access-7qhsg" (OuterVolumeSpecName: "kube-api-access-7qhsg") pod "5c33b211-6feb-4cb5-ba6a-42d683e790ca" (UID: "5c33b211-6feb-4cb5-ba6a-42d683e790ca"). InnerVolumeSpecName "kube-api-access-7qhsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.490224 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c33b211-6feb-4cb5-ba6a-42d683e790ca" (UID: "5c33b211-6feb-4cb5-ba6a-42d683e790ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.549443 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qhsg\" (UniqueName: \"kubernetes.io/projected/5c33b211-6feb-4cb5-ba6a-42d683e790ca-kube-api-access-7qhsg\") on node \"crc\" DevicePath \"\"" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.549481 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.549490 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c33b211-6feb-4cb5-ba6a-42d683e790ca-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.875784 4574 generic.go:334] "Generic (PLEG): container finished" podID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerID="3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc" exitCode=0 Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.875855 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8kds9" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.875855 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kds9" event={"ID":"5c33b211-6feb-4cb5-ba6a-42d683e790ca","Type":"ContainerDied","Data":"3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc"} Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.876706 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kds9" event={"ID":"5c33b211-6feb-4cb5-ba6a-42d683e790ca","Type":"ContainerDied","Data":"5d98bf528d614804d7ee82602d5ba4542e256f5615d6214748f318700b19165d"} Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.876727 4574 scope.go:117] "RemoveContainer" containerID="3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.902483 4574 scope.go:117] "RemoveContainer" containerID="5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.915624 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8kds9"] Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.926513 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8kds9"] Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.932447 4574 scope.go:117] "RemoveContainer" containerID="2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.983657 4574 scope.go:117] "RemoveContainer" containerID="3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc" Oct 04 05:31:05 crc kubenswrapper[4574]: E1004 05:31:05.984264 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc\": container with ID starting with 3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc not found: ID does not exist" containerID="3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.984308 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc"} err="failed to get container status \"3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc\": rpc error: code = NotFound desc = could not find container \"3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc\": container with ID starting with 3c8cf38fc2803768da89a208bce290976855cc103dac8b351e08621ca7654ccc not found: ID does not exist" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.984334 4574 scope.go:117] "RemoveContainer" containerID="5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea" Oct 04 05:31:05 crc kubenswrapper[4574]: E1004 05:31:05.984629 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea\": container with ID starting with 5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea not found: ID does not exist" containerID="5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.984655 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea"} err="failed to get container status \"5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea\": rpc error: code = NotFound desc = could not find container \"5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea\": container with ID 
starting with 5c59953f1be217a2a87d5cd272df5b21371ecec46cd2a40bc08ff5d62aab85ea not found: ID does not exist" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.984673 4574 scope.go:117] "RemoveContainer" containerID="2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c" Oct 04 05:31:05 crc kubenswrapper[4574]: E1004 05:31:05.984893 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c\": container with ID starting with 2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c not found: ID does not exist" containerID="2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c" Oct 04 05:31:05 crc kubenswrapper[4574]: I1004 05:31:05.984910 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c"} err="failed to get container status \"2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c\": rpc error: code = NotFound desc = could not find container \"2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c\": container with ID starting with 2263b1058e59dcbc8c3856ddd10c4a752fd4243d9757939ff5cb4665bb7d465c not found: ID does not exist" Oct 04 05:31:06 crc kubenswrapper[4574]: I1004 05:31:06.745689 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" path="/var/lib/kubelet/pods/5c33b211-6feb-4cb5-ba6a-42d683e790ca/volumes" Oct 04 05:31:19 crc kubenswrapper[4574]: I1004 05:31:19.405209 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:31:19 crc kubenswrapper[4574]: I1004 
05:31:19.406101 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.830651 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4qpb7"] Oct 04 05:31:40 crc kubenswrapper[4574]: E1004 05:31:40.836409 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="extract-content" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.836538 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="extract-content" Oct 04 05:31:40 crc kubenswrapper[4574]: E1004 05:31:40.836685 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="extract-utilities" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.836766 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="extract-utilities" Oct 04 05:31:40 crc kubenswrapper[4574]: E1004 05:31:40.836834 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="registry-server" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.836899 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="registry-server" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.837201 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c33b211-6feb-4cb5-ba6a-42d683e790ca" containerName="registry-server" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.839437 4574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.848950 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qpb7"] Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.943283 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-utilities\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.943413 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:40 crc kubenswrapper[4574]: I1004 05:31:40.943458 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxrl\" (UniqueName: \"kubernetes.io/projected/048ffd60-4588-489b-94aa-919e55db8f33-kube-api-access-xvxrl\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.045988 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.046059 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xvxrl\" (UniqueName: \"kubernetes.io/projected/048ffd60-4588-489b-94aa-919e55db8f33-kube-api-access-xvxrl\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.046211 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-utilities\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.046774 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-utilities\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.047049 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.075566 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxrl\" (UniqueName: \"kubernetes.io/projected/048ffd60-4588-489b-94aa-919e55db8f33-kube-api-access-xvxrl\") pod \"redhat-operators-4qpb7\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.178335 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:41 crc kubenswrapper[4574]: I1004 05:31:41.693672 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qpb7"] Oct 04 05:31:42 crc kubenswrapper[4574]: I1004 05:31:42.211936 4574 generic.go:334] "Generic (PLEG): container finished" podID="048ffd60-4588-489b-94aa-919e55db8f33" containerID="e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633" exitCode=0 Oct 04 05:31:42 crc kubenswrapper[4574]: I1004 05:31:42.211982 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qpb7" event={"ID":"048ffd60-4588-489b-94aa-919e55db8f33","Type":"ContainerDied","Data":"e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633"} Oct 04 05:31:42 crc kubenswrapper[4574]: I1004 05:31:42.212008 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qpb7" event={"ID":"048ffd60-4588-489b-94aa-919e55db8f33","Type":"ContainerStarted","Data":"795798b50e8642f2f6550f01de40d19b936ed754120fb429b8959db1d56d5181"} Oct 04 05:31:43 crc kubenswrapper[4574]: I1004 05:31:43.225025 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qpb7" event={"ID":"048ffd60-4588-489b-94aa-919e55db8f33","Type":"ContainerStarted","Data":"0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690"} Oct 04 05:31:47 crc kubenswrapper[4574]: I1004 05:31:47.261345 4574 generic.go:334] "Generic (PLEG): container finished" podID="048ffd60-4588-489b-94aa-919e55db8f33" containerID="0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690" exitCode=0 Oct 04 05:31:47 crc kubenswrapper[4574]: I1004 05:31:47.261437 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qpb7" 
event={"ID":"048ffd60-4588-489b-94aa-919e55db8f33","Type":"ContainerDied","Data":"0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690"} Oct 04 05:31:48 crc kubenswrapper[4574]: I1004 05:31:48.273596 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qpb7" event={"ID":"048ffd60-4588-489b-94aa-919e55db8f33","Type":"ContainerStarted","Data":"41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94"} Oct 04 05:31:48 crc kubenswrapper[4574]: I1004 05:31:48.292438 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4qpb7" podStartSLOduration=2.762656194 podStartE2EDuration="8.292421491s" podCreationTimestamp="2025-10-04 05:31:40 +0000 UTC" firstStartedPulling="2025-10-04 05:31:42.214266265 +0000 UTC m=+2728.068409307" lastFinishedPulling="2025-10-04 05:31:47.744031562 +0000 UTC m=+2733.598174604" observedRunningTime="2025-10-04 05:31:48.29166559 +0000 UTC m=+2734.145808632" watchObservedRunningTime="2025-10-04 05:31:48.292421491 +0000 UTC m=+2734.146564533" Oct 04 05:31:49 crc kubenswrapper[4574]: I1004 05:31:49.405139 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:31:49 crc kubenswrapper[4574]: I1004 05:31:49.405220 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:31:49 crc kubenswrapper[4574]: I1004 05:31:49.405298 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:31:49 crc kubenswrapper[4574]: I1004 05:31:49.406028 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9081c4a83fb866d34f5bb46858bafeae567e5c4da6462a0dd84649b8d9cefca1"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:31:49 crc kubenswrapper[4574]: I1004 05:31:49.406092 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://9081c4a83fb866d34f5bb46858bafeae567e5c4da6462a0dd84649b8d9cefca1" gracePeriod=600 Oct 04 05:31:50 crc kubenswrapper[4574]: I1004 05:31:50.295644 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="9081c4a83fb866d34f5bb46858bafeae567e5c4da6462a0dd84649b8d9cefca1" exitCode=0 Oct 04 05:31:50 crc kubenswrapper[4574]: I1004 05:31:50.295720 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"9081c4a83fb866d34f5bb46858bafeae567e5c4da6462a0dd84649b8d9cefca1"} Oct 04 05:31:50 crc kubenswrapper[4574]: I1004 05:31:50.296067 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5"} Oct 04 05:31:50 crc kubenswrapper[4574]: I1004 05:31:50.296098 4574 scope.go:117] "RemoveContainer" 
containerID="09b42f1d257738c89ebf1209f44f9b5a882f292ac5d3f361ed429819390889ea" Oct 04 05:31:51 crc kubenswrapper[4574]: I1004 05:31:51.179699 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:51 crc kubenswrapper[4574]: I1004 05:31:51.180226 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:31:52 crc kubenswrapper[4574]: I1004 05:31:52.245170 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4qpb7" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="registry-server" probeResult="failure" output=< Oct 04 05:31:52 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:31:52 crc kubenswrapper[4574]: > Oct 04 05:32:02 crc kubenswrapper[4574]: I1004 05:32:02.225331 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4qpb7" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="registry-server" probeResult="failure" output=< Oct 04 05:32:02 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:32:02 crc kubenswrapper[4574]: > Oct 04 05:32:11 crc kubenswrapper[4574]: I1004 05:32:11.223767 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:32:11 crc kubenswrapper[4574]: I1004 05:32:11.288444 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:32:12 crc kubenswrapper[4574]: I1004 05:32:12.033003 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qpb7"] Oct 04 05:32:12 crc kubenswrapper[4574]: I1004 05:32:12.493551 4574 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-4qpb7" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="registry-server" containerID="cri-o://41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94" gracePeriod=2 Oct 04 05:32:12 crc kubenswrapper[4574]: I1004 05:32:12.948748 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.109546 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content\") pod \"048ffd60-4588-489b-94aa-919e55db8f33\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.109622 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-utilities\") pod \"048ffd60-4588-489b-94aa-919e55db8f33\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.109678 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvxrl\" (UniqueName: \"kubernetes.io/projected/048ffd60-4588-489b-94aa-919e55db8f33-kube-api-access-xvxrl\") pod \"048ffd60-4588-489b-94aa-919e55db8f33\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.110374 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-utilities" (OuterVolumeSpecName: "utilities") pod "048ffd60-4588-489b-94aa-919e55db8f33" (UID: "048ffd60-4588-489b-94aa-919e55db8f33"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.119547 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/048ffd60-4588-489b-94aa-919e55db8f33-kube-api-access-xvxrl" (OuterVolumeSpecName: "kube-api-access-xvxrl") pod "048ffd60-4588-489b-94aa-919e55db8f33" (UID: "048ffd60-4588-489b-94aa-919e55db8f33"). InnerVolumeSpecName "kube-api-access-xvxrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.212390 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "048ffd60-4588-489b-94aa-919e55db8f33" (UID: "048ffd60-4588-489b-94aa-919e55db8f33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.213273 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content\") pod \"048ffd60-4588-489b-94aa-919e55db8f33\" (UID: \"048ffd60-4588-489b-94aa-919e55db8f33\") " Oct 04 05:32:13 crc kubenswrapper[4574]: W1004 05:32:13.213604 4574 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/048ffd60-4588-489b-94aa-919e55db8f33/volumes/kubernetes.io~empty-dir/catalog-content Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.213626 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "048ffd60-4588-489b-94aa-919e55db8f33" (UID: "048ffd60-4588-489b-94aa-919e55db8f33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.218166 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.218205 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/048ffd60-4588-489b-94aa-919e55db8f33-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.218221 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvxrl\" (UniqueName: \"kubernetes.io/projected/048ffd60-4588-489b-94aa-919e55db8f33-kube-api-access-xvxrl\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.506949 4574 generic.go:334] "Generic (PLEG): container finished" podID="048ffd60-4588-489b-94aa-919e55db8f33" containerID="41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94" exitCode=0 Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.507006 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qpb7" event={"ID":"048ffd60-4588-489b-94aa-919e55db8f33","Type":"ContainerDied","Data":"41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94"} Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.507077 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4qpb7" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.507096 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qpb7" event={"ID":"048ffd60-4588-489b-94aa-919e55db8f33","Type":"ContainerDied","Data":"795798b50e8642f2f6550f01de40d19b936ed754120fb429b8959db1d56d5181"} Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.507128 4574 scope.go:117] "RemoveContainer" containerID="41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.532808 4574 scope.go:117] "RemoveContainer" containerID="0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.569782 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qpb7"] Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.583136 4574 scope.go:117] "RemoveContainer" containerID="e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.585569 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4qpb7"] Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.620647 4574 scope.go:117] "RemoveContainer" containerID="41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94" Oct 04 05:32:13 crc kubenswrapper[4574]: E1004 05:32:13.621628 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94\": container with ID starting with 41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94 not found: ID does not exist" containerID="41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.621674 4574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94"} err="failed to get container status \"41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94\": rpc error: code = NotFound desc = could not find container \"41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94\": container with ID starting with 41419d5004853871843a983efe3be1d3c6e0dabce3cc1068fb149ddee6e6ef94 not found: ID does not exist" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.621697 4574 scope.go:117] "RemoveContainer" containerID="0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690" Oct 04 05:32:13 crc kubenswrapper[4574]: E1004 05:32:13.622095 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690\": container with ID starting with 0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690 not found: ID does not exist" containerID="0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.622115 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690"} err="failed to get container status \"0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690\": rpc error: code = NotFound desc = could not find container \"0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690\": container with ID starting with 0407476afeaf9057b15e9ba307de29a306bb0d0ca649de55140085aba3c95690 not found: ID does not exist" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.622130 4574 scope.go:117] "RemoveContainer" containerID="e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633" Oct 04 05:32:13 crc kubenswrapper[4574]: E1004 
05:32:13.622565 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633\": container with ID starting with e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633 not found: ID does not exist" containerID="e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633" Oct 04 05:32:13 crc kubenswrapper[4574]: I1004 05:32:13.622590 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633"} err="failed to get container status \"e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633\": rpc error: code = NotFound desc = could not find container \"e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633\": container with ID starting with e88f7ec8ba969628137a017ab7bb75f9fc4492456815dfdeb6bee852b13af633 not found: ID does not exist" Oct 04 05:32:14 crc kubenswrapper[4574]: I1004 05:32:14.745194 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="048ffd60-4588-489b-94aa-919e55db8f33" path="/var/lib/kubelet/pods/048ffd60-4588-489b-94aa-919e55db8f33/volumes" Oct 04 05:32:35 crc kubenswrapper[4574]: I1004 05:32:35.711590 4574 generic.go:334] "Generic (PLEG): container finished" podID="d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" containerID="51dcf2bf244a9d7d4281f785ca48e8ac2c300f6e1deadf7137a7ead7a991c704" exitCode=0 Oct 04 05:32:35 crc kubenswrapper[4574]: I1004 05:32:35.711665 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" event={"ID":"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811","Type":"ContainerDied","Data":"51dcf2bf244a9d7d4281f785ca48e8ac2c300f6e1deadf7137a7ead7a991c704"} Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.160812 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.309572 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-1\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.309646 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-1\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.309681 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-ssh-key\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.309756 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-extra-config-0\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.309852 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-combined-ca-bundle\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.309882 4574 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-inventory\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.309969 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-0\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.310029 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-0\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.310061 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czxnx\" (UniqueName: \"kubernetes.io/projected/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-kube-api-access-czxnx\") pod \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\" (UID: \"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811\") " Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.336276 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-kube-api-access-czxnx" (OuterVolumeSpecName: "kube-api-access-czxnx") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "kube-api-access-czxnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.336307 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.343785 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.343815 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.346494 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-inventory" (OuterVolumeSpecName: "inventory") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.359322 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.361377 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.363971 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.377503 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" (UID: "d85707e9-6bd8-4f36-b3c0-d8a0ccc88811"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412321 4574 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412359 4574 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412369 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412380 4574 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412389 4574 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412399 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412410 4574 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 
crc kubenswrapper[4574]: I1004 05:32:37.412422 4574 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.412433 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czxnx\" (UniqueName: \"kubernetes.io/projected/d85707e9-6bd8-4f36-b3c0-d8a0ccc88811-kube-api-access-czxnx\") on node \"crc\" DevicePath \"\"" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.731539 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" event={"ID":"d85707e9-6bd8-4f36-b3c0-d8a0ccc88811","Type":"ContainerDied","Data":"07d578beb6c414498b56fc7ed1558cd68c69ea0ef56dd4b5fb8f8546c44dbcc3"} Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.732091 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d578beb6c414498b56fc7ed1558cd68c69ea0ef56dd4b5fb8f8546c44dbcc3" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.732174 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-th9xm" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.851725 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw"] Oct 04 05:32:37 crc kubenswrapper[4574]: E1004 05:32:37.852506 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="extract-utilities" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.852633 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="extract-utilities" Oct 04 05:32:37 crc kubenswrapper[4574]: E1004 05:32:37.852757 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.852846 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 04 05:32:37 crc kubenswrapper[4574]: E1004 05:32:37.852950 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="extract-content" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.856408 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="extract-content" Oct 04 05:32:37 crc kubenswrapper[4574]: E1004 05:32:37.856507 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="registry-server" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.856519 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="registry-server" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.856928 4574 
memory_manager.go:354] "RemoveStaleState removing state" podUID="048ffd60-4588-489b-94aa-919e55db8f33" containerName="registry-server" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.856965 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="d85707e9-6bd8-4f36-b3c0-d8a0ccc88811" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.857818 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.859970 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n9qh4" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.860944 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.861272 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.861775 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.861869 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.864117 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw"] Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.922610 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.922670 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.922728 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpnh6\" (UniqueName: \"kubernetes.io/projected/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-kube-api-access-zpnh6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.922784 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.922819 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.922853 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:37 crc kubenswrapper[4574]: I1004 05:32:37.922875 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.023538 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpnh6\" (UniqueName: \"kubernetes.io/projected/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-kube-api-access-zpnh6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.023937 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 
05:32:38.023978 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.024009 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.024030 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.024063 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.024090 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.028988 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.029804 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.030607 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.030753 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 
05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.031355 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.036732 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.050954 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpnh6\" (UniqueName: \"kubernetes.io/projected/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-kube-api-access-zpnh6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.191001 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.718836 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw"] Oct 04 05:32:38 crc kubenswrapper[4574]: I1004 05:32:38.773250 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" event={"ID":"8393cfca-67a9-4740-bb68-8a6cfe3f12b4","Type":"ContainerStarted","Data":"7880d4a3e6c536511baa84878195bc5f1c0e2cb48e29735ff0a131132c0ad694"} Oct 04 05:32:39 crc kubenswrapper[4574]: I1004 05:32:39.784600 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" event={"ID":"8393cfca-67a9-4740-bb68-8a6cfe3f12b4","Type":"ContainerStarted","Data":"15348382a6b95c9dcdc751b42285ba8e6c8d2b8cd8a7e5010388e1ecb3852de3"} Oct 04 05:32:39 crc kubenswrapper[4574]: I1004 05:32:39.814963 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" podStartSLOduration=2.35921934 podStartE2EDuration="2.814936899s" podCreationTimestamp="2025-10-04 05:32:37 +0000 UTC" firstStartedPulling="2025-10-04 05:32:38.732085434 +0000 UTC m=+2784.586228476" lastFinishedPulling="2025-10-04 05:32:39.187802993 +0000 UTC m=+2785.041946035" observedRunningTime="2025-10-04 05:32:39.809681747 +0000 UTC m=+2785.663824829" watchObservedRunningTime="2025-10-04 05:32:39.814936899 +0000 UTC m=+2785.669079951" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.428469 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkj"] Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.431483 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.437403 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkj"] Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.515827 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4qnr\" (UniqueName: \"kubernetes.io/projected/340d7898-f701-40f8-919a-8b641cef0d3d-kube-api-access-x4qnr\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.515901 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-catalog-content\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.515971 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-utilities\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.617596 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4qnr\" (UniqueName: \"kubernetes.io/projected/340d7898-f701-40f8-919a-8b641cef0d3d-kube-api-access-x4qnr\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.617694 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-catalog-content\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.617746 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-utilities\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.618397 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-utilities\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.618491 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-catalog-content\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.648468 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4qnr\" (UniqueName: \"kubernetes.io/projected/340d7898-f701-40f8-919a-8b641cef0d3d-kube-api-access-x4qnr\") pod \"redhat-marketplace-8nfkj\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:32:59 crc kubenswrapper[4574]: I1004 05:32:59.752217 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:33:00 crc kubenswrapper[4574]: I1004 05:33:00.243138 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkj"] Oct 04 05:33:00 crc kubenswrapper[4574]: I1004 05:33:00.957731 4574 generic.go:334] "Generic (PLEG): container finished" podID="340d7898-f701-40f8-919a-8b641cef0d3d" containerID="6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9" exitCode=0 Oct 04 05:33:00 crc kubenswrapper[4574]: I1004 05:33:00.958430 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkj" event={"ID":"340d7898-f701-40f8-919a-8b641cef0d3d","Type":"ContainerDied","Data":"6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9"} Oct 04 05:33:00 crc kubenswrapper[4574]: I1004 05:33:00.959352 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkj" event={"ID":"340d7898-f701-40f8-919a-8b641cef0d3d","Type":"ContainerStarted","Data":"d54168b8477b6106ec7501aedddd27a5cf92b7a0a2915b951f3669b456196b23"} Oct 04 05:33:02 crc kubenswrapper[4574]: I1004 05:33:02.982046 4574 generic.go:334] "Generic (PLEG): container finished" podID="340d7898-f701-40f8-919a-8b641cef0d3d" containerID="cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82" exitCode=0 Oct 04 05:33:02 crc kubenswrapper[4574]: I1004 05:33:02.982116 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkj" event={"ID":"340d7898-f701-40f8-919a-8b641cef0d3d","Type":"ContainerDied","Data":"cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82"} Oct 04 05:33:03 crc kubenswrapper[4574]: I1004 05:33:03.993986 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkj" 
event={"ID":"340d7898-f701-40f8-919a-8b641cef0d3d","Type":"ContainerStarted","Data":"11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f"} Oct 04 05:33:04 crc kubenswrapper[4574]: I1004 05:33:04.017686 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8nfkj" podStartSLOduration=2.499236568 podStartE2EDuration="5.01766506s" podCreationTimestamp="2025-10-04 05:32:59 +0000 UTC" firstStartedPulling="2025-10-04 05:33:00.964266007 +0000 UTC m=+2806.818409049" lastFinishedPulling="2025-10-04 05:33:03.482694489 +0000 UTC m=+2809.336837541" observedRunningTime="2025-10-04 05:33:04.012049598 +0000 UTC m=+2809.866192640" watchObservedRunningTime="2025-10-04 05:33:04.01766506 +0000 UTC m=+2809.871808102" Oct 04 05:33:09 crc kubenswrapper[4574]: I1004 05:33:09.752817 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:33:09 crc kubenswrapper[4574]: I1004 05:33:09.753371 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:33:09 crc kubenswrapper[4574]: I1004 05:33:09.813939 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:33:10 crc kubenswrapper[4574]: I1004 05:33:10.104863 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:33:10 crc kubenswrapper[4574]: I1004 05:33:10.159492 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkj"] Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.060083 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8nfkj" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="registry-server" 
containerID="cri-o://11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f" gracePeriod=2 Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.526542 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.572847 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-catalog-content\") pod \"340d7898-f701-40f8-919a-8b641cef0d3d\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.573066 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-utilities\") pod \"340d7898-f701-40f8-919a-8b641cef0d3d\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.573133 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4qnr\" (UniqueName: \"kubernetes.io/projected/340d7898-f701-40f8-919a-8b641cef0d3d-kube-api-access-x4qnr\") pod \"340d7898-f701-40f8-919a-8b641cef0d3d\" (UID: \"340d7898-f701-40f8-919a-8b641cef0d3d\") " Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.573945 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-utilities" (OuterVolumeSpecName: "utilities") pod "340d7898-f701-40f8-919a-8b641cef0d3d" (UID: "340d7898-f701-40f8-919a-8b641cef0d3d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.579809 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340d7898-f701-40f8-919a-8b641cef0d3d-kube-api-access-x4qnr" (OuterVolumeSpecName: "kube-api-access-x4qnr") pod "340d7898-f701-40f8-919a-8b641cef0d3d" (UID: "340d7898-f701-40f8-919a-8b641cef0d3d"). InnerVolumeSpecName "kube-api-access-x4qnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.588790 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "340d7898-f701-40f8-919a-8b641cef0d3d" (UID: "340d7898-f701-40f8-919a-8b641cef0d3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.675617 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.675956 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/340d7898-f701-40f8-919a-8b641cef0d3d-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:33:12 crc kubenswrapper[4574]: I1004 05:33:12.675969 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4qnr\" (UniqueName: \"kubernetes.io/projected/340d7898-f701-40f8-919a-8b641cef0d3d-kube-api-access-x4qnr\") on node \"crc\" DevicePath \"\"" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.073669 4574 generic.go:334] "Generic (PLEG): container finished" podID="340d7898-f701-40f8-919a-8b641cef0d3d" 
containerID="11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f" exitCode=0 Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.073722 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkj" event={"ID":"340d7898-f701-40f8-919a-8b641cef0d3d","Type":"ContainerDied","Data":"11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f"} Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.073736 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8nfkj" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.073757 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8nfkj" event={"ID":"340d7898-f701-40f8-919a-8b641cef0d3d","Type":"ContainerDied","Data":"d54168b8477b6106ec7501aedddd27a5cf92b7a0a2915b951f3669b456196b23"} Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.073778 4574 scope.go:117] "RemoveContainer" containerID="11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.099841 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkj"] Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.103365 4574 scope.go:117] "RemoveContainer" containerID="cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.114625 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8nfkj"] Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.125759 4574 scope.go:117] "RemoveContainer" containerID="6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.172117 4574 scope.go:117] "RemoveContainer" containerID="11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f" Oct 04 
05:33:13 crc kubenswrapper[4574]: E1004 05:33:13.172623 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f\": container with ID starting with 11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f not found: ID does not exist" containerID="11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.172664 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f"} err="failed to get container status \"11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f\": rpc error: code = NotFound desc = could not find container \"11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f\": container with ID starting with 11df200131df8af407110cb70dcb95a330860c080ef1d8afb109611488a0598f not found: ID does not exist" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.172691 4574 scope.go:117] "RemoveContainer" containerID="cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82" Oct 04 05:33:13 crc kubenswrapper[4574]: E1004 05:33:13.173041 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82\": container with ID starting with cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82 not found: ID does not exist" containerID="cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.173061 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82"} err="failed to get container status 
\"cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82\": rpc error: code = NotFound desc = could not find container \"cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82\": container with ID starting with cb120e3ed27004b61eeeb9ad71f2fd70116edaa08256bc0226183b6e1aa71c82 not found: ID does not exist" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.173073 4574 scope.go:117] "RemoveContainer" containerID="6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9" Oct 04 05:33:13 crc kubenswrapper[4574]: E1004 05:33:13.173406 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9\": container with ID starting with 6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9 not found: ID does not exist" containerID="6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9" Oct 04 05:33:13 crc kubenswrapper[4574]: I1004 05:33:13.173449 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9"} err="failed to get container status \"6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9\": rpc error: code = NotFound desc = could not find container \"6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9\": container with ID starting with 6346fcb61c402ee895fa532b1d13209451832b013bb08d3b68747c6eca8684e9 not found: ID does not exist" Oct 04 05:33:14 crc kubenswrapper[4574]: I1004 05:33:14.744815 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" path="/var/lib/kubelet/pods/340d7898-f701-40f8-919a-8b641cef0d3d/volumes" Oct 04 05:33:49 crc kubenswrapper[4574]: I1004 05:33:49.404839 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:33:49 crc kubenswrapper[4574]: I1004 05:33:49.405486 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:34:19 crc kubenswrapper[4574]: I1004 05:34:19.404659 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:34:19 crc kubenswrapper[4574]: I1004 05:34:19.405261 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.405153 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.405836 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.405886 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.406706 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.406781 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" gracePeriod=600 Oct 04 05:34:49 crc kubenswrapper[4574]: E1004 05:34:49.535620 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.927790 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" exitCode=0 Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.927844 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5"} Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.927881 4574 scope.go:117] "RemoveContainer" containerID="9081c4a83fb866d34f5bb46858bafeae567e5c4da6462a0dd84649b8d9cefca1" Oct 04 05:34:49 crc kubenswrapper[4574]: I1004 05:34:49.928900 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:34:49 crc kubenswrapper[4574]: E1004 05:34:49.929385 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:35:04 crc kubenswrapper[4574]: I1004 05:35:04.740515 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:35:04 crc kubenswrapper[4574]: E1004 05:35:04.741935 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:35:19 crc kubenswrapper[4574]: I1004 05:35:19.734179 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:35:19 crc kubenswrapper[4574]: E1004 05:35:19.735654 4574 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:35:34 crc kubenswrapper[4574]: I1004 05:35:34.745489 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:35:34 crc kubenswrapper[4574]: E1004 05:35:34.746209 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:35:39 crc kubenswrapper[4574]: I1004 05:35:39.368718 4574 generic.go:334] "Generic (PLEG): container finished" podID="8393cfca-67a9-4740-bb68-8a6cfe3f12b4" containerID="15348382a6b95c9dcdc751b42285ba8e6c8d2b8cd8a7e5010388e1ecb3852de3" exitCode=0 Oct 04 05:35:39 crc kubenswrapper[4574]: I1004 05:35:39.368818 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" event={"ID":"8393cfca-67a9-4740-bb68-8a6cfe3f12b4","Type":"ContainerDied","Data":"15348382a6b95c9dcdc751b42285ba8e6c8d2b8cd8a7e5010388e1ecb3852de3"} Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.088104 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.179871 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-inventory\") pod \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.179934 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-telemetry-combined-ca-bundle\") pod \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.179958 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpnh6\" (UniqueName: \"kubernetes.io/projected/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-kube-api-access-zpnh6\") pod \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.179977 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-2\") pod \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.180167 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-1\") pod \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " Oct 04 05:35:41 crc kubenswrapper[4574]: 
I1004 05:35:41.180208 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ssh-key\") pod \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.180276 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-0\") pod \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\" (UID: \"8393cfca-67a9-4740-bb68-8a6cfe3f12b4\") " Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.185994 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-kube-api-access-zpnh6" (OuterVolumeSpecName: "kube-api-access-zpnh6") pod "8393cfca-67a9-4740-bb68-8a6cfe3f12b4" (UID: "8393cfca-67a9-4740-bb68-8a6cfe3f12b4"). InnerVolumeSpecName "kube-api-access-zpnh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.198432 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8393cfca-67a9-4740-bb68-8a6cfe3f12b4" (UID: "8393cfca-67a9-4740-bb68-8a6cfe3f12b4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.207522 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-inventory" (OuterVolumeSpecName: "inventory") pod "8393cfca-67a9-4740-bb68-8a6cfe3f12b4" (UID: "8393cfca-67a9-4740-bb68-8a6cfe3f12b4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.207619 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8393cfca-67a9-4740-bb68-8a6cfe3f12b4" (UID: "8393cfca-67a9-4740-bb68-8a6cfe3f12b4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.210165 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8393cfca-67a9-4740-bb68-8a6cfe3f12b4" (UID: "8393cfca-67a9-4740-bb68-8a6cfe3f12b4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.219512 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8393cfca-67a9-4740-bb68-8a6cfe3f12b4" (UID: "8393cfca-67a9-4740-bb68-8a6cfe3f12b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.220848 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8393cfca-67a9-4740-bb68-8a6cfe3f12b4" (UID: "8393cfca-67a9-4740-bb68-8a6cfe3f12b4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.282559 4574 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.282597 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpnh6\" (UniqueName: \"kubernetes.io/projected/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-kube-api-access-zpnh6\") on node \"crc\" DevicePath \"\"" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.282609 4574 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.282623 4574 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.282635 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.282646 4574 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.282659 4574 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8393cfca-67a9-4740-bb68-8a6cfe3f12b4-inventory\") on node \"crc\" DevicePath \"\"" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.387474 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" event={"ID":"8393cfca-67a9-4740-bb68-8a6cfe3f12b4","Type":"ContainerDied","Data":"7880d4a3e6c536511baa84878195bc5f1c0e2cb48e29735ff0a131132c0ad694"} Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.388189 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7880d4a3e6c536511baa84878195bc5f1c0e2cb48e29735ff0a131132c0ad694" Oct 04 05:35:41 crc kubenswrapper[4574]: I1004 05:35:41.387527 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw" Oct 04 05:35:46 crc kubenswrapper[4574]: I1004 05:35:46.733262 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:35:46 crc kubenswrapper[4574]: E1004 05:35:46.734120 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:36:00 crc kubenswrapper[4574]: I1004 05:36:00.733598 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:36:00 crc kubenswrapper[4574]: E1004 05:36:00.735766 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:36:14 crc kubenswrapper[4574]: I1004 05:36:14.738518 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:36:14 crc kubenswrapper[4574]: E1004 05:36:14.739259 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:36:25 crc kubenswrapper[4574]: I1004 05:36:25.733848 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:36:25 crc kubenswrapper[4574]: E1004 05:36:25.734607 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:36:40 crc kubenswrapper[4574]: I1004 05:36:40.734181 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:36:40 crc kubenswrapper[4574]: E1004 05:36:40.734988 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.476476 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 04 05:36:45 crc kubenswrapper[4574]: E1004 05:36:45.477407 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="extract-utilities" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.477424 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="extract-utilities" Oct 04 05:36:45 crc kubenswrapper[4574]: E1004 05:36:45.477457 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="extract-content" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.477467 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="extract-content" Oct 04 05:36:45 crc kubenswrapper[4574]: E1004 05:36:45.477483 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="registry-server" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.477489 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="registry-server" Oct 04 05:36:45 crc kubenswrapper[4574]: E1004 05:36:45.477507 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8393cfca-67a9-4740-bb68-8a6cfe3f12b4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.477516 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8393cfca-67a9-4740-bb68-8a6cfe3f12b4" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.477738 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="340d7898-f701-40f8-919a-8b641cef0d3d" containerName="registry-server" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.477764 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="8393cfca-67a9-4740-bb68-8a6cfe3f12b4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.478546 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.480787 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.484224 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.484635 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.484960 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vqfwn" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.489716 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579275 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579316 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9g4l\" (UniqueName: \"kubernetes.io/projected/20e889e6-41a7-4c36-ac15-8dc429f15aeb-kube-api-access-n9g4l\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579375 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579472 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579539 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579582 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579615 4574 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579636 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-config-data\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.579683 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.680962 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681067 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681107 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9g4l\" (UniqueName: \"kubernetes.io/projected/20e889e6-41a7-4c36-ac15-8dc429f15aeb-kube-api-access-n9g4l\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681163 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681181 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681251 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681290 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681318 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681335 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-config-data\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681615 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.681755 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.683985 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.687227 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.688030 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-config-data\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.692251 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.693852 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.693954 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.703302 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9g4l\" (UniqueName: \"kubernetes.io/projected/20e889e6-41a7-4c36-ac15-8dc429f15aeb-kube-api-access-n9g4l\") pod \"tempest-tests-tempest\" (UID: 
\"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.758823 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " pod="openstack/tempest-tests-tempest" Oct 04 05:36:45 crc kubenswrapper[4574]: I1004 05:36:45.803116 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 05:36:46 crc kubenswrapper[4574]: I1004 05:36:46.275783 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 04 05:36:46 crc kubenswrapper[4574]: I1004 05:36:46.281061 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:36:46 crc kubenswrapper[4574]: I1004 05:36:46.936069 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"20e889e6-41a7-4c36-ac15-8dc429f15aeb","Type":"ContainerStarted","Data":"3fe9f725a4437a68c7858d07e33b40c7b98510e3b916e5074c311721813cd53b"} Oct 04 05:36:55 crc kubenswrapper[4574]: I1004 05:36:55.733386 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:36:55 crc kubenswrapper[4574]: E1004 05:36:55.734229 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:37:06 crc kubenswrapper[4574]: I1004 05:37:06.734224 4574 scope.go:117] 
"RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:37:06 crc kubenswrapper[4574]: E1004 05:37:06.735005 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:37:16 crc kubenswrapper[4574]: E1004 05:37:16.956207 4574 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 04 05:37:16 crc kubenswrapper[4574]: E1004 05:37:16.958364 4574 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack
/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9g4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(20e889e6-41a7-4c36-ac15-8dc429f15aeb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 04 05:37:16 crc kubenswrapper[4574]: E1004 05:37:16.959768 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="20e889e6-41a7-4c36-ac15-8dc429f15aeb" Oct 04 05:37:17 crc kubenswrapper[4574]: E1004 05:37:17.247675 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="20e889e6-41a7-4c36-ac15-8dc429f15aeb" Oct 04 05:37:17 crc kubenswrapper[4574]: I1004 05:37:17.733477 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:37:17 crc kubenswrapper[4574]: E1004 05:37:17.733701 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:37:28 crc kubenswrapper[4574]: I1004 05:37:28.180652 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 04 05:37:29 crc kubenswrapper[4574]: I1004 05:37:29.733845 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 
05:37:29 crc kubenswrapper[4574]: E1004 05:37:29.735357 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:37:30 crc kubenswrapper[4574]: I1004 05:37:30.353727 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"20e889e6-41a7-4c36-ac15-8dc429f15aeb","Type":"ContainerStarted","Data":"dede8ccbf220bab9dc58914eea60bfab57dd356e65486cace2e9ea0538c58709"} Oct 04 05:37:42 crc kubenswrapper[4574]: I1004 05:37:42.733325 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:37:42 crc kubenswrapper[4574]: E1004 05:37:42.734077 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:37:56 crc kubenswrapper[4574]: I1004 05:37:56.734036 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:37:56 crc kubenswrapper[4574]: E1004 05:37:56.735427 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:38:09 crc kubenswrapper[4574]: I1004 05:38:09.733111 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:38:09 crc kubenswrapper[4574]: E1004 05:38:09.733878 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:38:23 crc kubenswrapper[4574]: I1004 05:38:23.733872 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:38:23 crc kubenswrapper[4574]: E1004 05:38:23.734711 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.064957 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=67.167604954 podStartE2EDuration="1m49.064934659s" podCreationTimestamp="2025-10-04 05:36:44 +0000 UTC" firstStartedPulling="2025-10-04 05:36:46.280860726 +0000 UTC m=+3032.135003768" lastFinishedPulling="2025-10-04 05:37:28.178190431 +0000 
UTC m=+3074.032333473" observedRunningTime="2025-10-04 05:37:30.372122984 +0000 UTC m=+3076.226266026" watchObservedRunningTime="2025-10-04 05:38:33.064934659 +0000 UTC m=+3138.919077721" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.070074 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f67zx"] Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.072478 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.083387 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f67zx"] Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.156714 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-catalog-content\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.156774 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4kv6\" (UniqueName: \"kubernetes.io/projected/939a9c07-796c-4f97-a87e-5df91e06659b-kube-api-access-v4kv6\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.156868 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-utilities\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc 
kubenswrapper[4574]: I1004 05:38:33.258675 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-catalog-content\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.258719 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4kv6\" (UniqueName: \"kubernetes.io/projected/939a9c07-796c-4f97-a87e-5df91e06659b-kube-api-access-v4kv6\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.258777 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-utilities\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.259291 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-utilities\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.259299 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-catalog-content\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.290174 
4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4kv6\" (UniqueName: \"kubernetes.io/projected/939a9c07-796c-4f97-a87e-5df91e06659b-kube-api-access-v4kv6\") pod \"community-operators-f67zx\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.392521 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:33 crc kubenswrapper[4574]: I1004 05:38:33.972259 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f67zx"] Oct 04 05:38:34 crc kubenswrapper[4574]: I1004 05:38:34.979353 4574 generic.go:334] "Generic (PLEG): container finished" podID="939a9c07-796c-4f97-a87e-5df91e06659b" containerID="021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990" exitCode=0 Oct 04 05:38:34 crc kubenswrapper[4574]: I1004 05:38:34.979549 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67zx" event={"ID":"939a9c07-796c-4f97-a87e-5df91e06659b","Type":"ContainerDied","Data":"021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990"} Oct 04 05:38:34 crc kubenswrapper[4574]: I1004 05:38:34.979932 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67zx" event={"ID":"939a9c07-796c-4f97-a87e-5df91e06659b","Type":"ContainerStarted","Data":"5743e4475abe7d4ace4f5eae0846a6f04547c6b763f883cfa35fbe809cf9e8cc"} Oct 04 05:38:35 crc kubenswrapper[4574]: I1004 05:38:35.733551 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:38:35 crc kubenswrapper[4574]: E1004 05:38:35.734031 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:38:35 crc kubenswrapper[4574]: I1004 05:38:35.990795 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67zx" event={"ID":"939a9c07-796c-4f97-a87e-5df91e06659b","Type":"ContainerStarted","Data":"c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473"} Oct 04 05:38:37 crc kubenswrapper[4574]: I1004 05:38:37.020489 4574 generic.go:334] "Generic (PLEG): container finished" podID="939a9c07-796c-4f97-a87e-5df91e06659b" containerID="c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473" exitCode=0 Oct 04 05:38:37 crc kubenswrapper[4574]: I1004 05:38:37.024513 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67zx" event={"ID":"939a9c07-796c-4f97-a87e-5df91e06659b","Type":"ContainerDied","Data":"c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473"} Oct 04 05:38:38 crc kubenswrapper[4574]: I1004 05:38:38.052667 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67zx" event={"ID":"939a9c07-796c-4f97-a87e-5df91e06659b","Type":"ContainerStarted","Data":"5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c"} Oct 04 05:38:38 crc kubenswrapper[4574]: I1004 05:38:38.079655 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f67zx" podStartSLOduration=2.451193194 podStartE2EDuration="5.079633831s" podCreationTimestamp="2025-10-04 05:38:33 +0000 UTC" firstStartedPulling="2025-10-04 05:38:34.981662438 +0000 UTC m=+3140.835805480" lastFinishedPulling="2025-10-04 05:38:37.610103075 +0000 UTC m=+3143.464246117" 
observedRunningTime="2025-10-04 05:38:38.075872202 +0000 UTC m=+3143.930015234" watchObservedRunningTime="2025-10-04 05:38:38.079633831 +0000 UTC m=+3143.933776873" Oct 04 05:38:43 crc kubenswrapper[4574]: I1004 05:38:43.394003 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:43 crc kubenswrapper[4574]: I1004 05:38:43.395507 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:43 crc kubenswrapper[4574]: I1004 05:38:43.444609 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:44 crc kubenswrapper[4574]: I1004 05:38:44.153252 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:44 crc kubenswrapper[4574]: I1004 05:38:44.458204 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f67zx"] Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.122184 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f67zx" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="registry-server" containerID="cri-o://5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c" gracePeriod=2 Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.722071 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.880279 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-utilities\") pod \"939a9c07-796c-4f97-a87e-5df91e06659b\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.880386 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4kv6\" (UniqueName: \"kubernetes.io/projected/939a9c07-796c-4f97-a87e-5df91e06659b-kube-api-access-v4kv6\") pod \"939a9c07-796c-4f97-a87e-5df91e06659b\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.880486 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-catalog-content\") pod \"939a9c07-796c-4f97-a87e-5df91e06659b\" (UID: \"939a9c07-796c-4f97-a87e-5df91e06659b\") " Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.882512 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-utilities" (OuterVolumeSpecName: "utilities") pod "939a9c07-796c-4f97-a87e-5df91e06659b" (UID: "939a9c07-796c-4f97-a87e-5df91e06659b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.891806 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939a9c07-796c-4f97-a87e-5df91e06659b-kube-api-access-v4kv6" (OuterVolumeSpecName: "kube-api-access-v4kv6") pod "939a9c07-796c-4f97-a87e-5df91e06659b" (UID: "939a9c07-796c-4f97-a87e-5df91e06659b"). InnerVolumeSpecName "kube-api-access-v4kv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.943845 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "939a9c07-796c-4f97-a87e-5df91e06659b" (UID: "939a9c07-796c-4f97-a87e-5df91e06659b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.986495 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.986614 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/939a9c07-796c-4f97-a87e-5df91e06659b-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:38:46 crc kubenswrapper[4574]: I1004 05:38:46.986630 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4kv6\" (UniqueName: \"kubernetes.io/projected/939a9c07-796c-4f97-a87e-5df91e06659b-kube-api-access-v4kv6\") on node \"crc\" DevicePath \"\"" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.132606 4574 generic.go:334] "Generic (PLEG): container finished" podID="939a9c07-796c-4f97-a87e-5df91e06659b" containerID="5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c" exitCode=0 Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.132654 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67zx" event={"ID":"939a9c07-796c-4f97-a87e-5df91e06659b","Type":"ContainerDied","Data":"5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c"} Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.132688 4574 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-f67zx" event={"ID":"939a9c07-796c-4f97-a87e-5df91e06659b","Type":"ContainerDied","Data":"5743e4475abe7d4ace4f5eae0846a6f04547c6b763f883cfa35fbe809cf9e8cc"} Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.132685 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f67zx" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.132764 4574 scope.go:117] "RemoveContainer" containerID="5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.166398 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f67zx"] Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.171106 4574 scope.go:117] "RemoveContainer" containerID="c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.177242 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f67zx"] Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.204335 4574 scope.go:117] "RemoveContainer" containerID="021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.247812 4574 scope.go:117] "RemoveContainer" containerID="5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c" Oct 04 05:38:47 crc kubenswrapper[4574]: E1004 05:38:47.248295 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c\": container with ID starting with 5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c not found: ID does not exist" containerID="5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 
05:38:47.248344 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c"} err="failed to get container status \"5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c\": rpc error: code = NotFound desc = could not find container \"5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c\": container with ID starting with 5742bb6329d5dfe25fe5a4762c635dee48b6a24faa909b361cf52a9bd4e47d7c not found: ID does not exist" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.248374 4574 scope.go:117] "RemoveContainer" containerID="c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473" Oct 04 05:38:47 crc kubenswrapper[4574]: E1004 05:38:47.249214 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473\": container with ID starting with c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473 not found: ID does not exist" containerID="c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.249274 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473"} err="failed to get container status \"c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473\": rpc error: code = NotFound desc = could not find container \"c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473\": container with ID starting with c8d56f3e09bb55c4757f975a14a8d3e205dd81131bc2be9e03411173aced9473 not found: ID does not exist" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.249299 4574 scope.go:117] "RemoveContainer" containerID="021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990" Oct 04 05:38:47 crc 
kubenswrapper[4574]: E1004 05:38:47.249702 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990\": container with ID starting with 021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990 not found: ID does not exist" containerID="021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990" Oct 04 05:38:47 crc kubenswrapper[4574]: I1004 05:38:47.249732 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990"} err="failed to get container status \"021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990\": rpc error: code = NotFound desc = could not find container \"021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990\": container with ID starting with 021cfad34548d88dbe064a20faa18daa1e6353759dadbcec97948707fa38b990 not found: ID does not exist" Oct 04 05:38:48 crc kubenswrapper[4574]: I1004 05:38:48.733436 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:38:48 crc kubenswrapper[4574]: E1004 05:38:48.733967 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:38:48 crc kubenswrapper[4574]: I1004 05:38:48.756808 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" path="/var/lib/kubelet/pods/939a9c07-796c-4f97-a87e-5df91e06659b/volumes" Oct 04 05:39:00 crc 
kubenswrapper[4574]: I1004 05:39:00.734109 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:39:00 crc kubenswrapper[4574]: E1004 05:39:00.735439 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:39:13 crc kubenswrapper[4574]: I1004 05:39:13.733211 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:39:13 crc kubenswrapper[4574]: E1004 05:39:13.734005 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:39:28 crc kubenswrapper[4574]: I1004 05:39:28.734628 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:39:28 crc kubenswrapper[4574]: E1004 05:39:28.735428 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 
04 05:39:42 crc kubenswrapper[4574]: I1004 05:39:42.733742 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:39:42 crc kubenswrapper[4574]: E1004 05:39:42.734485 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:39:50 crc kubenswrapper[4574]: I1004 05:39:50.684746 4574 generic.go:334] "Generic (PLEG): container finished" podID="20e889e6-41a7-4c36-ac15-8dc429f15aeb" containerID="dede8ccbf220bab9dc58914eea60bfab57dd356e65486cace2e9ea0538c58709" exitCode=0 Oct 04 05:39:50 crc kubenswrapper[4574]: I1004 05:39:50.684833 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"20e889e6-41a7-4c36-ac15-8dc429f15aeb","Type":"ContainerDied","Data":"dede8ccbf220bab9dc58914eea60bfab57dd356e65486cace2e9ea0538c58709"} Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.258851 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423401 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ca-certs\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423462 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423513 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423582 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config-secret\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423684 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ssh-key\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423723 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-config-data\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423773 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-temporary\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423836 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-workdir\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.423883 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9g4l\" (UniqueName: \"kubernetes.io/projected/20e889e6-41a7-4c36-ac15-8dc429f15aeb-kube-api-access-n9g4l\") pod \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\" (UID: \"20e889e6-41a7-4c36-ac15-8dc429f15aeb\") " Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.425328 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.425821 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-config-data" (OuterVolumeSpecName: "config-data") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.429515 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e889e6-41a7-4c36-ac15-8dc429f15aeb-kube-api-access-n9g4l" (OuterVolumeSpecName: "kube-api-access-n9g4l") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "kube-api-access-n9g4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.431090 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.441396 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.459857 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.462537 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.463438 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.484964 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "20e889e6-41a7-4c36-ac15-8dc429f15aeb" (UID: "20e889e6-41a7-4c36-ac15-8dc429f15aeb"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525806 4574 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525830 4574 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-config-data\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525841 4574 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525851 4574 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/20e889e6-41a7-4c36-ac15-8dc429f15aeb-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525860 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9g4l\" (UniqueName: \"kubernetes.io/projected/20e889e6-41a7-4c36-ac15-8dc429f15aeb-kube-api-access-n9g4l\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525870 4574 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525896 4574 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 04 05:39:52 crc 
kubenswrapper[4574]: I1004 05:39:52.525905 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.525913 4574 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/20e889e6-41a7-4c36-ac15-8dc429f15aeb-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.547614 4574 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.628414 4574 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.704595 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"20e889e6-41a7-4c36-ac15-8dc429f15aeb","Type":"ContainerDied","Data":"3fe9f725a4437a68c7858d07e33b40c7b98510e3b916e5074c311721813cd53b"} Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.704638 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fe9f725a4437a68c7858d07e33b40c7b98510e3b916e5074c311721813cd53b" Oct 04 05:39:52 crc kubenswrapper[4574]: I1004 05:39:52.704706 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 04 05:39:54 crc kubenswrapper[4574]: I1004 05:39:54.741606 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:39:55 crc kubenswrapper[4574]: I1004 05:39:55.731428 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"d3c969c5c34210d2443513e1094b552fde70ff1b5cf8839e3294ccaf892d01bd"} Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.542498 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 04 05:40:00 crc kubenswrapper[4574]: E1004 05:40:00.543537 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="registry-server" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.543554 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="registry-server" Oct 04 05:40:00 crc kubenswrapper[4574]: E1004 05:40:00.543570 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e889e6-41a7-4c36-ac15-8dc429f15aeb" containerName="tempest-tests-tempest-tests-runner" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.543578 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e889e6-41a7-4c36-ac15-8dc429f15aeb" containerName="tempest-tests-tempest-tests-runner" Oct 04 05:40:00 crc kubenswrapper[4574]: E1004 05:40:00.543596 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="extract-utilities" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.543602 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="extract-utilities" 
Oct 04 05:40:00 crc kubenswrapper[4574]: E1004 05:40:00.543609 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="extract-content" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.543614 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="extract-content" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.543806 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="939a9c07-796c-4f97-a87e-5df91e06659b" containerName="registry-server" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.543815 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e889e6-41a7-4c36-ac15-8dc429f15aeb" containerName="tempest-tests-tempest-tests-runner" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.544624 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.552151 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.558256 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vqfwn" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.678432 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qkg\" (UniqueName: \"kubernetes.io/projected/f0d647c3-a19e-44ce-9e3e-be13cf6e9586-kube-api-access-h4qkg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d647c3-a19e-44ce-9e3e-be13cf6e9586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.678692 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d647c3-a19e-44ce-9e3e-be13cf6e9586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.780950 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d647c3-a19e-44ce-9e3e-be13cf6e9586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.781018 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qkg\" (UniqueName: \"kubernetes.io/projected/f0d647c3-a19e-44ce-9e3e-be13cf6e9586-kube-api-access-h4qkg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d647c3-a19e-44ce-9e3e-be13cf6e9586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.781459 4574 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d647c3-a19e-44ce-9e3e-be13cf6e9586\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.801066 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qkg\" (UniqueName: \"kubernetes.io/projected/f0d647c3-a19e-44ce-9e3e-be13cf6e9586-kube-api-access-h4qkg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d647c3-a19e-44ce-9e3e-be13cf6e9586\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.811275 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d647c3-a19e-44ce-9e3e-be13cf6e9586\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:00 crc kubenswrapper[4574]: I1004 05:40:00.866969 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 04 05:40:01 crc kubenswrapper[4574]: I1004 05:40:01.393682 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 04 05:40:01 crc kubenswrapper[4574]: I1004 05:40:01.785433 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f0d647c3-a19e-44ce-9e3e-be13cf6e9586","Type":"ContainerStarted","Data":"71f643c8391a3c4331c785186c245e20077fadd2a633c0d5a714af8a92c7a0fa"} Oct 04 05:40:02 crc kubenswrapper[4574]: I1004 05:40:02.797189 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f0d647c3-a19e-44ce-9e3e-be13cf6e9586","Type":"ContainerStarted","Data":"b70d99caf4c3e8cda8242dbff096103f3d8ed0c08a9a2bd2b0fca69e93ed70df"} Oct 04 05:40:02 crc kubenswrapper[4574]: I1004 05:40:02.824104 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.866526319 podStartE2EDuration="2.824041495s" podCreationTimestamp="2025-10-04 05:40:00 +0000 UTC" firstStartedPulling="2025-10-04 05:40:01.398972811 +0000 UTC m=+3227.253115853" lastFinishedPulling="2025-10-04 
05:40:02.356487987 +0000 UTC m=+3228.210631029" observedRunningTime="2025-10-04 05:40:02.815343874 +0000 UTC m=+3228.669486946" watchObservedRunningTime="2025-10-04 05:40:02.824041495 +0000 UTC m=+3228.678184537" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.315399 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7hjf/must-gather-25mpc"] Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.318163 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.324816 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v7hjf"/"openshift-service-ca.crt" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.324816 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v7hjf"/"default-dockercfg-95hbw" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.324987 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v7hjf"/"kube-root-ca.crt" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.342692 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v7hjf/must-gather-25mpc"] Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.482620 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccbf02b4-3afc-447e-a4e1-6b784ebff333-must-gather-output\") pod \"must-gather-25mpc\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.482707 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqgqq\" (UniqueName: 
\"kubernetes.io/projected/ccbf02b4-3afc-447e-a4e1-6b784ebff333-kube-api-access-wqgqq\") pod \"must-gather-25mpc\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.584562 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccbf02b4-3afc-447e-a4e1-6b784ebff333-must-gather-output\") pod \"must-gather-25mpc\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.584647 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqgqq\" (UniqueName: \"kubernetes.io/projected/ccbf02b4-3afc-447e-a4e1-6b784ebff333-kube-api-access-wqgqq\") pod \"must-gather-25mpc\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.585050 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccbf02b4-3afc-447e-a4e1-6b784ebff333-must-gather-output\") pod \"must-gather-25mpc\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.609827 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqgqq\" (UniqueName: \"kubernetes.io/projected/ccbf02b4-3afc-447e-a4e1-6b784ebff333-kube-api-access-wqgqq\") pod \"must-gather-25mpc\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:21 crc kubenswrapper[4574]: I1004 05:40:21.636626 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:40:22 crc kubenswrapper[4574]: I1004 05:40:22.127657 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v7hjf/must-gather-25mpc"] Oct 04 05:40:22 crc kubenswrapper[4574]: W1004 05:40:22.144992 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccbf02b4_3afc_447e_a4e1_6b784ebff333.slice/crio-cc9c69edec276497a7d710b2004b7e9e0953a64f46ecd4600f1f1bdf9b6204fc WatchSource:0}: Error finding container cc9c69edec276497a7d710b2004b7e9e0953a64f46ecd4600f1f1bdf9b6204fc: Status 404 returned error can't find the container with id cc9c69edec276497a7d710b2004b7e9e0953a64f46ecd4600f1f1bdf9b6204fc Oct 04 05:40:22 crc kubenswrapper[4574]: I1004 05:40:22.972409 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/must-gather-25mpc" event={"ID":"ccbf02b4-3afc-447e-a4e1-6b784ebff333","Type":"ContainerStarted","Data":"cc9c69edec276497a7d710b2004b7e9e0953a64f46ecd4600f1f1bdf9b6204fc"} Oct 04 05:40:27 crc kubenswrapper[4574]: I1004 05:40:27.006080 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/must-gather-25mpc" event={"ID":"ccbf02b4-3afc-447e-a4e1-6b784ebff333","Type":"ContainerStarted","Data":"fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d"} Oct 04 05:40:27 crc kubenswrapper[4574]: I1004 05:40:27.006642 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/must-gather-25mpc" event={"ID":"ccbf02b4-3afc-447e-a4e1-6b784ebff333","Type":"ContainerStarted","Data":"49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4"} Oct 04 05:40:27 crc kubenswrapper[4574]: I1004 05:40:27.018904 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v7hjf/must-gather-25mpc" podStartSLOduration=2.000701701 
podStartE2EDuration="6.01889139s" podCreationTimestamp="2025-10-04 05:40:21 +0000 UTC" firstStartedPulling="2025-10-04 05:40:22.152565498 +0000 UTC m=+3248.006708530" lastFinishedPulling="2025-10-04 05:40:26.170755177 +0000 UTC m=+3252.024898219" observedRunningTime="2025-10-04 05:40:27.017837039 +0000 UTC m=+3252.871980081" watchObservedRunningTime="2025-10-04 05:40:27.01889139 +0000 UTC m=+3252.873034432" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.506444 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-78pt2"] Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.508221 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.646662 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a93ef4d1-6946-43d0-aa80-63cc0f180346-host\") pod \"crc-debug-78pt2\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.646770 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2kjp\" (UniqueName: \"kubernetes.io/projected/a93ef4d1-6946-43d0-aa80-63cc0f180346-kube-api-access-c2kjp\") pod \"crc-debug-78pt2\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.749569 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a93ef4d1-6946-43d0-aa80-63cc0f180346-host\") pod \"crc-debug-78pt2\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.749950 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2kjp\" (UniqueName: \"kubernetes.io/projected/a93ef4d1-6946-43d0-aa80-63cc0f180346-kube-api-access-c2kjp\") pod \"crc-debug-78pt2\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.749695 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a93ef4d1-6946-43d0-aa80-63cc0f180346-host\") pod \"crc-debug-78pt2\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.779543 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2kjp\" (UniqueName: \"kubernetes.io/projected/a93ef4d1-6946-43d0-aa80-63cc0f180346-kube-api-access-c2kjp\") pod \"crc-debug-78pt2\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: I1004 05:40:30.826301 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:40:30 crc kubenswrapper[4574]: W1004 05:40:30.858128 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda93ef4d1_6946_43d0_aa80_63cc0f180346.slice/crio-0c805b2225b0abab3611bc30bcd981f7af06602c5a436a659557430c1fac214d WatchSource:0}: Error finding container 0c805b2225b0abab3611bc30bcd981f7af06602c5a436a659557430c1fac214d: Status 404 returned error can't find the container with id 0c805b2225b0abab3611bc30bcd981f7af06602c5a436a659557430c1fac214d Oct 04 05:40:31 crc kubenswrapper[4574]: I1004 05:40:31.041539 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" event={"ID":"a93ef4d1-6946-43d0-aa80-63cc0f180346","Type":"ContainerStarted","Data":"0c805b2225b0abab3611bc30bcd981f7af06602c5a436a659557430c1fac214d"} Oct 04 05:40:43 crc kubenswrapper[4574]: I1004 05:40:43.168804 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" event={"ID":"a93ef4d1-6946-43d0-aa80-63cc0f180346","Type":"ContainerStarted","Data":"c9e099becdd9aa247bc8ef30d0f1268a24165b125aea1a4811b9cbb815db82dd"} Oct 04 05:40:43 crc kubenswrapper[4574]: I1004 05:40:43.188703 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" podStartSLOduration=1.176520902 podStartE2EDuration="13.188681924s" podCreationTimestamp="2025-10-04 05:40:30 +0000 UTC" firstStartedPulling="2025-10-04 05:40:30.866885109 +0000 UTC m=+3256.721028161" lastFinishedPulling="2025-10-04 05:40:42.879046141 +0000 UTC m=+3268.733189183" observedRunningTime="2025-10-04 05:40:43.183427812 +0000 UTC m=+3269.037570854" watchObservedRunningTime="2025-10-04 05:40:43.188681924 +0000 UTC m=+3269.042824966" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.631994 4574 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-5dw5k"] Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.634389 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.653075 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dw5k"] Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.721403 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-utilities\") pod \"certified-operators-5dw5k\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.721742 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-catalog-content\") pod \"certified-operators-5dw5k\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.721854 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5nh9\" (UniqueName: \"kubernetes.io/projected/9786b87d-8131-4830-8377-f2ef7f52cd74-kube-api-access-s5nh9\") pod \"certified-operators-5dw5k\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.824770 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-utilities\") pod \"certified-operators-5dw5k\" (UID: 
\"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.824958 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-catalog-content\") pod \"certified-operators-5dw5k\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.825013 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5nh9\" (UniqueName: \"kubernetes.io/projected/9786b87d-8131-4830-8377-f2ef7f52cd74-kube-api-access-s5nh9\") pod \"certified-operators-5dw5k\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.825738 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-utilities\") pod \"certified-operators-5dw5k\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.826564 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-catalog-content\") pod \"certified-operators-5dw5k\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.851270 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5nh9\" (UniqueName: \"kubernetes.io/projected/9786b87d-8131-4830-8377-f2ef7f52cd74-kube-api-access-s5nh9\") pod \"certified-operators-5dw5k\" (UID: 
\"9786b87d-8131-4830-8377-f2ef7f52cd74\") " pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:14 crc kubenswrapper[4574]: I1004 05:41:14.958934 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:15 crc kubenswrapper[4574]: I1004 05:41:15.650024 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dw5k"] Oct 04 05:41:16 crc kubenswrapper[4574]: I1004 05:41:16.483047 4574 generic.go:334] "Generic (PLEG): container finished" podID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerID="4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996" exitCode=0 Oct 04 05:41:16 crc kubenswrapper[4574]: I1004 05:41:16.483136 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dw5k" event={"ID":"9786b87d-8131-4830-8377-f2ef7f52cd74","Type":"ContainerDied","Data":"4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996"} Oct 04 05:41:16 crc kubenswrapper[4574]: I1004 05:41:16.483649 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dw5k" event={"ID":"9786b87d-8131-4830-8377-f2ef7f52cd74","Type":"ContainerStarted","Data":"aee37447f55880731f5c76f8fffd15c54f470569baeb6c31336b0a9b238249da"} Oct 04 05:41:18 crc kubenswrapper[4574]: I1004 05:41:18.510425 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dw5k" event={"ID":"9786b87d-8131-4830-8377-f2ef7f52cd74","Type":"ContainerStarted","Data":"f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd"} Oct 04 05:41:20 crc kubenswrapper[4574]: E1004 05:41:20.181157 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9786b87d_8131_4830_8377_f2ef7f52cd74.slice/crio-conmon-f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd.scope\": RecentStats: unable to find data in memory cache]" Oct 04 05:41:20 crc kubenswrapper[4574]: I1004 05:41:20.546166 4574 generic.go:334] "Generic (PLEG): container finished" podID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerID="f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd" exitCode=0 Oct 04 05:41:20 crc kubenswrapper[4574]: I1004 05:41:20.546263 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dw5k" event={"ID":"9786b87d-8131-4830-8377-f2ef7f52cd74","Type":"ContainerDied","Data":"f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd"} Oct 04 05:41:21 crc kubenswrapper[4574]: I1004 05:41:21.569487 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dw5k" event={"ID":"9786b87d-8131-4830-8377-f2ef7f52cd74","Type":"ContainerStarted","Data":"330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46"} Oct 04 05:41:21 crc kubenswrapper[4574]: I1004 05:41:21.588454 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5dw5k" podStartSLOduration=2.961209044 podStartE2EDuration="7.588433793s" podCreationTimestamp="2025-10-04 05:41:14 +0000 UTC" firstStartedPulling="2025-10-04 05:41:16.488324181 +0000 UTC m=+3302.342467213" lastFinishedPulling="2025-10-04 05:41:21.11554892 +0000 UTC m=+3306.969691962" observedRunningTime="2025-10-04 05:41:21.586981261 +0000 UTC m=+3307.441124303" watchObservedRunningTime="2025-10-04 05:41:21.588433793 +0000 UTC m=+3307.442576835" Oct 04 05:41:24 crc kubenswrapper[4574]: I1004 05:41:24.959948 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:24 crc 
kubenswrapper[4574]: I1004 05:41:24.961538 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:26 crc kubenswrapper[4574]: I1004 05:41:26.020258 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5dw5k" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="registry-server" probeResult="failure" output=< Oct 04 05:41:26 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:41:26 crc kubenswrapper[4574]: > Oct 04 05:41:35 crc kubenswrapper[4574]: I1004 05:41:35.022724 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:35 crc kubenswrapper[4574]: I1004 05:41:35.107451 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:35 crc kubenswrapper[4574]: I1004 05:41:35.261149 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dw5k"] Oct 04 05:41:36 crc kubenswrapper[4574]: I1004 05:41:36.718575 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5dw5k" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="registry-server" containerID="cri-o://330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46" gracePeriod=2 Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.274090 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.347530 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5nh9\" (UniqueName: \"kubernetes.io/projected/9786b87d-8131-4830-8377-f2ef7f52cd74-kube-api-access-s5nh9\") pod \"9786b87d-8131-4830-8377-f2ef7f52cd74\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.347697 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-catalog-content\") pod \"9786b87d-8131-4830-8377-f2ef7f52cd74\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.347767 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-utilities\") pod \"9786b87d-8131-4830-8377-f2ef7f52cd74\" (UID: \"9786b87d-8131-4830-8377-f2ef7f52cd74\") " Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.348595 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-utilities" (OuterVolumeSpecName: "utilities") pod "9786b87d-8131-4830-8377-f2ef7f52cd74" (UID: "9786b87d-8131-4830-8377-f2ef7f52cd74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.378469 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9786b87d-8131-4830-8377-f2ef7f52cd74-kube-api-access-s5nh9" (OuterVolumeSpecName: "kube-api-access-s5nh9") pod "9786b87d-8131-4830-8377-f2ef7f52cd74" (UID: "9786b87d-8131-4830-8377-f2ef7f52cd74"). InnerVolumeSpecName "kube-api-access-s5nh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.423045 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9786b87d-8131-4830-8377-f2ef7f52cd74" (UID: "9786b87d-8131-4830-8377-f2ef7f52cd74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.450144 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5nh9\" (UniqueName: \"kubernetes.io/projected/9786b87d-8131-4830-8377-f2ef7f52cd74-kube-api-access-s5nh9\") on node \"crc\" DevicePath \"\"" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.450190 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.450202 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9786b87d-8131-4830-8377-f2ef7f52cd74-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.733923 4574 generic.go:334] "Generic (PLEG): container finished" podID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerID="330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46" exitCode=0 Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.733957 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dw5k" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.733976 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dw5k" event={"ID":"9786b87d-8131-4830-8377-f2ef7f52cd74","Type":"ContainerDied","Data":"330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46"} Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.735992 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dw5k" event={"ID":"9786b87d-8131-4830-8377-f2ef7f52cd74","Type":"ContainerDied","Data":"aee37447f55880731f5c76f8fffd15c54f470569baeb6c31336b0a9b238249da"} Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.736024 4574 scope.go:117] "RemoveContainer" containerID="330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.778542 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dw5k"] Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.779530 4574 scope.go:117] "RemoveContainer" containerID="f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.788496 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5dw5k"] Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.803385 4574 scope.go:117] "RemoveContainer" containerID="4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.856692 4574 scope.go:117] "RemoveContainer" containerID="330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46" Oct 04 05:41:37 crc kubenswrapper[4574]: E1004 05:41:37.857076 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46\": container with ID starting with 330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46 not found: ID does not exist" containerID="330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.857198 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46"} err="failed to get container status \"330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46\": rpc error: code = NotFound desc = could not find container \"330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46\": container with ID starting with 330e32ddecda447a653c25a694ac176d83bc9cc8f05543db77b3809051455b46 not found: ID does not exist" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.857338 4574 scope.go:117] "RemoveContainer" containerID="f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd" Oct 04 05:41:37 crc kubenswrapper[4574]: E1004 05:41:37.857697 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd\": container with ID starting with f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd not found: ID does not exist" containerID="f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.857724 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd"} err="failed to get container status \"f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd\": rpc error: code = NotFound desc = could not find container \"f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd\": container with ID 
starting with f9951b0f873cfc21c22ad1674765f701fd6a01180f7d51ebec3ab47db87acedd not found: ID does not exist" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.857746 4574 scope.go:117] "RemoveContainer" containerID="4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996" Oct 04 05:41:37 crc kubenswrapper[4574]: E1004 05:41:37.858053 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996\": container with ID starting with 4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996 not found: ID does not exist" containerID="4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996" Oct 04 05:41:37 crc kubenswrapper[4574]: I1004 05:41:37.858110 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996"} err="failed to get container status \"4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996\": rpc error: code = NotFound desc = could not find container \"4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996\": container with ID starting with 4dd9537238f291efacb01b4950c0672fe175ff875a5b5eb01bab037727c07996 not found: ID does not exist" Oct 04 05:41:38 crc kubenswrapper[4574]: I1004 05:41:38.752751 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" path="/var/lib/kubelet/pods/9786b87d-8131-4830-8377-f2ef7f52cd74/volumes" Oct 04 05:41:51 crc kubenswrapper[4574]: I1004 05:41:51.201097 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-766d778598-9bz6b_c224adb6-7a04-4bd4-bc6a-d8c484c8710e/barbican-api/0.log" Oct 04 05:41:51 crc kubenswrapper[4574]: I1004 05:41:51.287910 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-766d778598-9bz6b_c224adb6-7a04-4bd4-bc6a-d8c484c8710e/barbican-api-log/0.log" Oct 04 05:41:51 crc kubenswrapper[4574]: I1004 05:41:51.535175 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5475848bb4-qk59c_5438cd90-23bc-4da2-8856-519b7656f8ff/barbican-keystone-listener/0.log" Oct 04 05:41:51 crc kubenswrapper[4574]: I1004 05:41:51.709563 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5475848bb4-qk59c_5438cd90-23bc-4da2-8856-519b7656f8ff/barbican-keystone-listener-log/0.log" Oct 04 05:41:51 crc kubenswrapper[4574]: I1004 05:41:51.937966 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86595bb85-v84cq_857fe45e-27ff-44ef-b58c-9e1278946927/barbican-worker/0.log" Oct 04 05:41:52 crc kubenswrapper[4574]: I1004 05:41:52.006705 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86595bb85-v84cq_857fe45e-27ff-44ef-b58c-9e1278946927/barbican-worker-log/0.log" Oct 04 05:41:52 crc kubenswrapper[4574]: I1004 05:41:52.253831 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2_1e9631eb-d051-4087-81eb-2f33ea4dd993/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:52 crc kubenswrapper[4574]: I1004 05:41:52.456928 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/ceilometer-central-agent/0.log" Oct 04 05:41:52 crc kubenswrapper[4574]: I1004 05:41:52.551684 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/proxy-httpd/0.log" Oct 04 05:41:52 crc kubenswrapper[4574]: I1004 05:41:52.612955 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/ceilometer-notification-agent/0.log" Oct 04 05:41:52 crc kubenswrapper[4574]: I1004 05:41:52.866001 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/sg-core/0.log" Oct 04 05:41:53 crc kubenswrapper[4574]: I1004 05:41:53.216708 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_38984f83-1657-45b8-bcd4-448c2306ea86/cinder-api/0.log" Oct 04 05:41:53 crc kubenswrapper[4574]: I1004 05:41:53.283481 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_38984f83-1657-45b8-bcd4-448c2306ea86/cinder-api-log/0.log" Oct 04 05:41:53 crc kubenswrapper[4574]: I1004 05:41:53.509054 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_161f98e1-5520-4148-8565-05394e7e8daf/probe/0.log" Oct 04 05:41:53 crc kubenswrapper[4574]: I1004 05:41:53.653247 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_161f98e1-5520-4148-8565-05394e7e8daf/cinder-scheduler/0.log" Oct 04 05:41:53 crc kubenswrapper[4574]: I1004 05:41:53.851848 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vj82x_75cb1602-ada9-4442-be91-3fa85a464d5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:54 crc kubenswrapper[4574]: I1004 05:41:54.064420 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-65256_da9c2287-2920-4152-bf57-7eb8effbea81/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:54 crc kubenswrapper[4574]: I1004 05:41:54.297723 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-grdlr_9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:54 crc kubenswrapper[4574]: I1004 05:41:54.485970 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44rq7_3f536491-9237-4de1-b43d-2ffefcf26eb8/init/0.log" Oct 04 05:41:54 crc kubenswrapper[4574]: I1004 05:41:54.802784 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44rq7_3f536491-9237-4de1-b43d-2ffefcf26eb8/init/0.log" Oct 04 05:41:54 crc kubenswrapper[4574]: I1004 05:41:54.817131 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44rq7_3f536491-9237-4de1-b43d-2ffefcf26eb8/dnsmasq-dns/0.log" Oct 04 05:41:55 crc kubenswrapper[4574]: I1004 05:41:55.032254 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs_b8508613-3769-4000-9037-bce43bf206bb/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:55 crc kubenswrapper[4574]: I1004 05:41:55.139928 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5f5514e6-eceb-4683-9633-684cc13d5458/glance-httpd/0.log" Oct 04 05:41:55 crc kubenswrapper[4574]: I1004 05:41:55.224363 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5f5514e6-eceb-4683-9633-684cc13d5458/glance-log/0.log" Oct 04 05:41:55 crc kubenswrapper[4574]: I1004 05:41:55.452458 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7/glance-httpd/0.log" Oct 04 05:41:55 crc kubenswrapper[4574]: I1004 05:41:55.498374 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7/glance-log/0.log" Oct 04 05:41:55 crc kubenswrapper[4574]: I1004 05:41:55.779364 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57bfb4d496-nv6hv_85281a42-f9ab-4302-9fe9-4e742075530f/horizon/2.log" Oct 04 05:41:55 crc kubenswrapper[4574]: I1004 05:41:55.831150 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57bfb4d496-nv6hv_85281a42-f9ab-4302-9fe9-4e742075530f/horizon/1.log" Oct 04 05:41:56 crc kubenswrapper[4574]: I1004 05:41:56.144591 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57bfb4d496-nv6hv_85281a42-f9ab-4302-9fe9-4e742075530f/horizon-log/0.log" Oct 04 05:41:56 crc kubenswrapper[4574]: I1004 05:41:56.349327 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s_5b869cbb-6227-4391-9faf-2565fc5a4acd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:56 crc kubenswrapper[4574]: I1004 05:41:56.519679 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kvnsk_39389db3-7317-49b0-af09-e9459d02c5e7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:56 crc kubenswrapper[4574]: I1004 05:41:56.778141 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cd47459a-7171-4d7e-8f65-20a2936ce760/kube-state-metrics/0.log" Oct 04 05:41:56 crc kubenswrapper[4574]: I1004 05:41:56.869439 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78786b8bfb-qgltl_1e4a50fe-8cee-4243-a215-9c82e358ea30/keystone-api/0.log" Oct 04 05:41:57 crc kubenswrapper[4574]: I1004 05:41:57.067953 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qhprs_7f92a088-639a-4112-910b-bb2a76600bac/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:57 crc kubenswrapper[4574]: I1004 05:41:57.408561 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54746bc5fc-22pbj_e736cc6e-edb6-4fad-8687-6c4e2a85d0a0/neutron-httpd/0.log" Oct 04 05:41:57 crc kubenswrapper[4574]: I1004 05:41:57.427349 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54746bc5fc-22pbj_e736cc6e-edb6-4fad-8687-6c4e2a85d0a0/neutron-api/0.log" Oct 04 05:41:57 crc kubenswrapper[4574]: I1004 05:41:57.681550 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt_658de4d9-d56d-45fd-b0bc-781bbbb30a5e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:58 crc kubenswrapper[4574]: I1004 05:41:58.202880 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_97365d9d-d7a3-42b9-8131-54dea698f6f8/nova-api-log/0.log" Oct 04 05:41:58 crc kubenswrapper[4574]: I1004 05:41:58.342994 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_97365d9d-d7a3-42b9-8131-54dea698f6f8/nova-api-api/0.log" Oct 04 05:41:58 crc kubenswrapper[4574]: I1004 05:41:58.498361 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8377c768-d10d-49d6-b43f-b1aeedcdeae6/nova-cell0-conductor-conductor/0.log" Oct 04 05:41:58 crc kubenswrapper[4574]: I1004 05:41:58.788399 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f6ae6da5-dd08-46b4-94cf-589b9c4f5139/nova-cell1-conductor-conductor/0.log" Oct 04 05:41:59 crc kubenswrapper[4574]: I1004 05:41:59.021588 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_43dfa220-f267-43c2-9b28-4dc23a4a3eeb/nova-cell1-novncproxy-novncproxy/0.log" Oct 04 05:41:59 crc kubenswrapper[4574]: I1004 05:41:59.218991 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-th9xm_d85707e9-6bd8-4f36-b3c0-d8a0ccc88811/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:41:59 crc kubenswrapper[4574]: I1004 05:41:59.600754 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7e8c70bd-bcf3-4379-a026-5a52411a56ab/nova-metadata-log/0.log" Oct 04 05:42:00 crc kubenswrapper[4574]: I1004 05:42:00.064595 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c/nova-scheduler-scheduler/0.log" Oct 04 05:42:00 crc kubenswrapper[4574]: I1004 05:42:00.380862 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f275e3ec-6c93-412b-875c-65b03a785dc0/mysql-bootstrap/0.log" Oct 04 05:42:00 crc kubenswrapper[4574]: I1004 05:42:00.698519 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7e8c70bd-bcf3-4379-a026-5a52411a56ab/nova-metadata-metadata/0.log" Oct 04 05:42:00 crc kubenswrapper[4574]: I1004 05:42:00.707297 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f275e3ec-6c93-412b-875c-65b03a785dc0/galera/0.log" Oct 04 05:42:00 crc kubenswrapper[4574]: I1004 05:42:00.712347 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f275e3ec-6c93-412b-875c-65b03a785dc0/mysql-bootstrap/0.log" Oct 04 05:42:01 crc kubenswrapper[4574]: I1004 05:42:01.068673 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c862e2a0-256a-470f-b35b-c244555f0c5f/mysql-bootstrap/0.log" Oct 04 05:42:01 crc kubenswrapper[4574]: I1004 05:42:01.315541 4574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c862e2a0-256a-470f-b35b-c244555f0c5f/mysql-bootstrap/0.log" Oct 04 05:42:01 crc kubenswrapper[4574]: I1004 05:42:01.322431 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c862e2a0-256a-470f-b35b-c244555f0c5f/galera/0.log" Oct 04 05:42:01 crc kubenswrapper[4574]: I1004 05:42:01.599948 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-khsmk_3836030c-f0c4-4392-bc54-cc817fd89934/ovn-controller/0.log" Oct 04 05:42:01 crc kubenswrapper[4574]: I1004 05:42:01.661905 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2552db74-0d8b-4ca0-af2e-092c03e097f2/openstackclient/0.log" Oct 04 05:42:01 crc kubenswrapper[4574]: I1004 05:42:01.947393 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4576m_fb659229-980c-4368-a799-f0db3f3330da/openstack-network-exporter/0.log" Oct 04 05:42:02 crc kubenswrapper[4574]: I1004 05:42:02.198417 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovsdb-server-init/0.log" Oct 04 05:42:02 crc kubenswrapper[4574]: I1004 05:42:02.469355 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovsdb-server-init/0.log" Oct 04 05:42:02 crc kubenswrapper[4574]: I1004 05:42:02.544850 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovsdb-server/0.log" Oct 04 05:42:02 crc kubenswrapper[4574]: I1004 05:42:02.564482 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovs-vswitchd/0.log" Oct 04 05:42:02 crc kubenswrapper[4574]: I1004 05:42:02.854336 4574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-pph2t_b1cddc5d-210f-4762-9f80-1b055ad2239b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:42:03 crc kubenswrapper[4574]: I1004 05:42:03.088919 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e306e34-ea03-4a60-9adc-99f30618be02/openstack-network-exporter/0.log" Oct 04 05:42:03 crc kubenswrapper[4574]: I1004 05:42:03.164965 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e306e34-ea03-4a60-9adc-99f30618be02/ovn-northd/0.log" Oct 04 05:42:03 crc kubenswrapper[4574]: I1004 05:42:03.375259 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cd1d5524-8818-4988-9969-45c2f2904fb4/openstack-network-exporter/0.log" Oct 04 05:42:03 crc kubenswrapper[4574]: I1004 05:42:03.435817 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cd1d5524-8818-4988-9969-45c2f2904fb4/ovsdbserver-nb/0.log" Oct 04 05:42:03 crc kubenswrapper[4574]: I1004 05:42:03.677375 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e5b7c0f-9b1c-411c-94b0-f57b8157c998/openstack-network-exporter/0.log" Oct 04 05:42:03 crc kubenswrapper[4574]: I1004 05:42:03.721967 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e5b7c0f-9b1c-411c-94b0-f57b8157c998/ovsdbserver-sb/0.log" Oct 04 05:42:04 crc kubenswrapper[4574]: I1004 05:42:04.221588 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5c85977bcb-np6n7_462b910b-39e1-4a9e-a82c-3cfe77462a97/placement-api/0.log" Oct 04 05:42:04 crc kubenswrapper[4574]: I1004 05:42:04.360732 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5c85977bcb-np6n7_462b910b-39e1-4a9e-a82c-3cfe77462a97/placement-log/0.log" Oct 04 05:42:04 crc kubenswrapper[4574]: I1004 05:42:04.529811 4574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bbad2653-45e8-4eb2-b7f8-60e6dcee36f2/setup-container/0.log" Oct 04 05:42:04 crc kubenswrapper[4574]: I1004 05:42:04.838025 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bbad2653-45e8-4eb2-b7f8-60e6dcee36f2/setup-container/0.log" Oct 04 05:42:04 crc kubenswrapper[4574]: I1004 05:42:04.859752 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bbad2653-45e8-4eb2-b7f8-60e6dcee36f2/rabbitmq/0.log" Oct 04 05:42:05 crc kubenswrapper[4574]: I1004 05:42:05.092653 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1298ffd0-9c09-4f29-b8bf-eaff9018fcb4/setup-container/0.log" Oct 04 05:42:05 crc kubenswrapper[4574]: I1004 05:42:05.330252 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1298ffd0-9c09-4f29-b8bf-eaff9018fcb4/setup-container/0.log" Oct 04 05:42:05 crc kubenswrapper[4574]: I1004 05:42:05.448806 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1298ffd0-9c09-4f29-b8bf-eaff9018fcb4/rabbitmq/0.log" Oct 04 05:42:05 crc kubenswrapper[4574]: I1004 05:42:05.640219 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6_eba11170-e0cf-4e7a-8e9a-771fde74bff1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:42:05 crc kubenswrapper[4574]: I1004 05:42:05.821148 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-74r7r_5ba9a62a-eb41-401f-ac26-779fb50b276a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:42:06 crc kubenswrapper[4574]: I1004 05:42:06.100013 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf_f0a5e204-886d-416f-96ad-46cc7715e417/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:42:06 crc kubenswrapper[4574]: I1004 05:42:06.318804 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jr6kt_0cad2098-82fe-4efb-89a6-a440ad6f73dc/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:42:06 crc kubenswrapper[4574]: I1004 05:42:06.365922 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5hpm9_b51be97d-af6b-432b-a671-040de2d05471/ssh-known-hosts-edpm-deployment/0.log" Oct 04 05:42:06 crc kubenswrapper[4574]: I1004 05:42:06.746556 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7688fc9d67-qlxww_710de145-ae9a-41bf-9b90-564a1e4acee6/proxy-httpd/0.log" Oct 04 05:42:06 crc kubenswrapper[4574]: I1004 05:42:06.766867 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7688fc9d67-qlxww_710de145-ae9a-41bf-9b90-564a1e4acee6/proxy-server/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.025535 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gbs8h_65ae5a48-3442-4149-9dbd-ac23191fa438/swift-ring-rebalance/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.273174 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-reaper/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.344532 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-auditor/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.491484 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-replicator/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.538393 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-server/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.585624 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-auditor/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.767397 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-replicator/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.814379 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-updater/0.log" Oct 04 05:42:07 crc kubenswrapper[4574]: I1004 05:42:07.855762 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-server/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.007288 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-auditor/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.067156 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-expirer/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.169171 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-replicator/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.313049 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-updater/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.359078 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-server/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.467935 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/rsync/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.636517 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/swift-recon-cron/0.log" Oct 04 05:42:08 crc kubenswrapper[4574]: I1004 05:42:08.875633 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw_8393cfca-67a9-4740-bb68-8a6cfe3f12b4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:42:09 crc kubenswrapper[4574]: I1004 05:42:09.067411 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_20e889e6-41a7-4c36-ac15-8dc429f15aeb/tempest-tests-tempest-tests-runner/0.log" Oct 04 05:42:09 crc kubenswrapper[4574]: I1004 05:42:09.274946 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f0d647c3-a19e-44ce-9e3e-be13cf6e9586/test-operator-logs-container/0.log" Oct 04 05:42:09 crc kubenswrapper[4574]: I1004 05:42:09.578331 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z98sd_b688d23c-d5f8-4fc1-bd58-8e710dae393b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.281800 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8pqtw"] Oct 04 05:42:14 crc 
kubenswrapper[4574]: E1004 05:42:14.283942 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="registry-server" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.283957 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="registry-server" Oct 04 05:42:14 crc kubenswrapper[4574]: E1004 05:42:14.283989 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="extract-utilities" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.283995 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="extract-utilities" Oct 04 05:42:14 crc kubenswrapper[4574]: E1004 05:42:14.284015 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="extract-content" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.284020 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="extract-content" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.284203 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9786b87d-8131-4830-8377-f2ef7f52cd74" containerName="registry-server" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.289605 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.315889 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pqtw"] Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.393451 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/f0010a0c-010e-4f21-86cd-a5238bb1efd9-kube-api-access-sqnf6\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.393573 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-catalog-content\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.393603 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-utilities\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.495217 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/f0010a0c-010e-4f21-86cd-a5238bb1efd9-kube-api-access-sqnf6\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.495337 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-catalog-content\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.495367 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-utilities\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.496330 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-utilities\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.496552 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-catalog-content\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.550379 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/f0010a0c-010e-4f21-86cd-a5238bb1efd9-kube-api-access-sqnf6\") pod \"redhat-operators-8pqtw\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:14 crc kubenswrapper[4574]: I1004 05:42:14.653506 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:16 crc kubenswrapper[4574]: I1004 05:42:15.334285 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pqtw"] Oct 04 05:42:16 crc kubenswrapper[4574]: I1004 05:42:16.191583 4574 generic.go:334] "Generic (PLEG): container finished" podID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerID="57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b" exitCode=0 Oct 04 05:42:16 crc kubenswrapper[4574]: I1004 05:42:16.191655 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pqtw" event={"ID":"f0010a0c-010e-4f21-86cd-a5238bb1efd9","Type":"ContainerDied","Data":"57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b"} Oct 04 05:42:16 crc kubenswrapper[4574]: I1004 05:42:16.192331 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pqtw" event={"ID":"f0010a0c-010e-4f21-86cd-a5238bb1efd9","Type":"ContainerStarted","Data":"dcec311dd9ed0c8e89a1a6daac7d608abb16effeb2fbe844745282c0f7225285"} Oct 04 05:42:16 crc kubenswrapper[4574]: I1004 05:42:16.197318 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:42:18 crc kubenswrapper[4574]: I1004 05:42:18.237063 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pqtw" event={"ID":"f0010a0c-010e-4f21-86cd-a5238bb1efd9","Type":"ContainerStarted","Data":"2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5"} Oct 04 05:42:19 crc kubenswrapper[4574]: I1004 05:42:19.338498 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3/memcached/0.log" Oct 04 05:42:19 crc kubenswrapper[4574]: I1004 05:42:19.404647 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:42:19 crc kubenswrapper[4574]: I1004 05:42:19.404919 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:42:22 crc kubenswrapper[4574]: I1004 05:42:22.300971 4574 generic.go:334] "Generic (PLEG): container finished" podID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerID="2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5" exitCode=0 Oct 04 05:42:22 crc kubenswrapper[4574]: I1004 05:42:22.301519 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pqtw" event={"ID":"f0010a0c-010e-4f21-86cd-a5238bb1efd9","Type":"ContainerDied","Data":"2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5"} Oct 04 05:42:23 crc kubenswrapper[4574]: I1004 05:42:23.312749 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pqtw" event={"ID":"f0010a0c-010e-4f21-86cd-a5238bb1efd9","Type":"ContainerStarted","Data":"5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b"} Oct 04 05:42:24 crc kubenswrapper[4574]: I1004 05:42:24.657948 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:24 crc kubenswrapper[4574]: I1004 05:42:24.658252 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:25 crc kubenswrapper[4574]: I1004 05:42:25.726843 4574 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-8pqtw" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="registry-server" probeResult="failure" output=< Oct 04 05:42:25 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:42:25 crc kubenswrapper[4574]: > Oct 04 05:42:35 crc kubenswrapper[4574]: I1004 05:42:35.706980 4574 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8pqtw" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="registry-server" probeResult="failure" output=< Oct 04 05:42:35 crc kubenswrapper[4574]: timeout: failed to connect service ":50051" within 1s Oct 04 05:42:35 crc kubenswrapper[4574]: > Oct 04 05:42:44 crc kubenswrapper[4574]: I1004 05:42:44.708332 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:44 crc kubenswrapper[4574]: I1004 05:42:44.735508 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8pqtw" podStartSLOduration=24.016824858 podStartE2EDuration="30.735490919s" podCreationTimestamp="2025-10-04 05:42:14 +0000 UTC" firstStartedPulling="2025-10-04 05:42:16.197005083 +0000 UTC m=+3362.051148135" lastFinishedPulling="2025-10-04 05:42:22.915671154 +0000 UTC m=+3368.769814196" observedRunningTime="2025-10-04 05:42:23.332373741 +0000 UTC m=+3369.186516783" watchObservedRunningTime="2025-10-04 05:42:44.735490919 +0000 UTC m=+3390.589633961" Oct 04 05:42:44 crc kubenswrapper[4574]: I1004 05:42:44.762601 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:45 crc kubenswrapper[4574]: I1004 05:42:45.479338 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pqtw"] Oct 04 05:42:46 crc kubenswrapper[4574]: I1004 05:42:46.534965 4574 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8pqtw" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="registry-server" containerID="cri-o://5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b" gracePeriod=2 Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.017339 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.111716 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-utilities\") pod \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.111814 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/f0010a0c-010e-4f21-86cd-a5238bb1efd9-kube-api-access-sqnf6\") pod \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.112032 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-catalog-content\") pod \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\" (UID: \"f0010a0c-010e-4f21-86cd-a5238bb1efd9\") " Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.116729 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-utilities" (OuterVolumeSpecName: "utilities") pod "f0010a0c-010e-4f21-86cd-a5238bb1efd9" (UID: "f0010a0c-010e-4f21-86cd-a5238bb1efd9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.123539 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0010a0c-010e-4f21-86cd-a5238bb1efd9-kube-api-access-sqnf6" (OuterVolumeSpecName: "kube-api-access-sqnf6") pod "f0010a0c-010e-4f21-86cd-a5238bb1efd9" (UID: "f0010a0c-010e-4f21-86cd-a5238bb1efd9"). InnerVolumeSpecName "kube-api-access-sqnf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.214151 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0010a0c-010e-4f21-86cd-a5238bb1efd9" (UID: "f0010a0c-010e-4f21-86cd-a5238bb1efd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.214726 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.214817 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0010a0c-010e-4f21-86cd-a5238bb1efd9-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.214886 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/f0010a0c-010e-4f21-86cd-a5238bb1efd9-kube-api-access-sqnf6\") on node \"crc\" DevicePath \"\"" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.544964 4574 generic.go:334] "Generic (PLEG): container finished" podID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" 
containerID="5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b" exitCode=0 Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.545045 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8pqtw" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.545052 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pqtw" event={"ID":"f0010a0c-010e-4f21-86cd-a5238bb1efd9","Type":"ContainerDied","Data":"5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b"} Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.545983 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pqtw" event={"ID":"f0010a0c-010e-4f21-86cd-a5238bb1efd9","Type":"ContainerDied","Data":"dcec311dd9ed0c8e89a1a6daac7d608abb16effeb2fbe844745282c0f7225285"} Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.546013 4574 scope.go:117] "RemoveContainer" containerID="5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.578432 4574 scope.go:117] "RemoveContainer" containerID="2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.588290 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pqtw"] Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.598262 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8pqtw"] Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.613869 4574 scope.go:117] "RemoveContainer" containerID="57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.650868 4574 scope.go:117] "RemoveContainer" containerID="5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b" Oct 04 05:42:47 crc 
kubenswrapper[4574]: E1004 05:42:47.651369 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b\": container with ID starting with 5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b not found: ID does not exist" containerID="5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.651410 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b"} err="failed to get container status \"5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b\": rpc error: code = NotFound desc = could not find container \"5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b\": container with ID starting with 5b7d5a4f2c37bcb3c079098e91ee41652991e4ff78e83394121cb1e50450656b not found: ID does not exist" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.651437 4574 scope.go:117] "RemoveContainer" containerID="2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5" Oct 04 05:42:47 crc kubenswrapper[4574]: E1004 05:42:47.651854 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5\": container with ID starting with 2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5 not found: ID does not exist" containerID="2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.651876 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5"} err="failed to get container status 
\"2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5\": rpc error: code = NotFound desc = could not find container \"2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5\": container with ID starting with 2d604c8641ce3da1506bcb0cca725ed39778eca7dce5d0ba1ce27fd34f4951b5 not found: ID does not exist" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.651890 4574 scope.go:117] "RemoveContainer" containerID="57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b" Oct 04 05:42:47 crc kubenswrapper[4574]: E1004 05:42:47.652165 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b\": container with ID starting with 57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b not found: ID does not exist" containerID="57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b" Oct 04 05:42:47 crc kubenswrapper[4574]: I1004 05:42:47.652208 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b"} err="failed to get container status \"57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b\": rpc error: code = NotFound desc = could not find container \"57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b\": container with ID starting with 57fdae5e9f4f5d88087b56dbce32e29256546cf7229c4484466becae9b57a68b not found: ID does not exist" Oct 04 05:42:48 crc kubenswrapper[4574]: I1004 05:42:48.752870 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" path="/var/lib/kubelet/pods/f0010a0c-010e-4f21-86cd-a5238bb1efd9/volumes" Oct 04 05:42:49 crc kubenswrapper[4574]: I1004 05:42:49.404845 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:42:49 crc kubenswrapper[4574]: I1004 05:42:49.405360 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:42:56 crc kubenswrapper[4574]: I1004 05:42:56.624089 4574 generic.go:334] "Generic (PLEG): container finished" podID="a93ef4d1-6946-43d0-aa80-63cc0f180346" containerID="c9e099becdd9aa247bc8ef30d0f1268a24165b125aea1a4811b9cbb815db82dd" exitCode=0 Oct 04 05:42:56 crc kubenswrapper[4574]: I1004 05:42:56.624171 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" event={"ID":"a93ef4d1-6946-43d0-aa80-63cc0f180346","Type":"ContainerDied","Data":"c9e099becdd9aa247bc8ef30d0f1268a24165b125aea1a4811b9cbb815db82dd"} Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.725798 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.774723 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-78pt2"] Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.782957 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-78pt2"] Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.814883 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2kjp\" (UniqueName: \"kubernetes.io/projected/a93ef4d1-6946-43d0-aa80-63cc0f180346-kube-api-access-c2kjp\") pod \"a93ef4d1-6946-43d0-aa80-63cc0f180346\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.815498 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a93ef4d1-6946-43d0-aa80-63cc0f180346-host\") pod \"a93ef4d1-6946-43d0-aa80-63cc0f180346\" (UID: \"a93ef4d1-6946-43d0-aa80-63cc0f180346\") " Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.815668 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a93ef4d1-6946-43d0-aa80-63cc0f180346-host" (OuterVolumeSpecName: "host") pod "a93ef4d1-6946-43d0-aa80-63cc0f180346" (UID: "a93ef4d1-6946-43d0-aa80-63cc0f180346"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.817113 4574 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a93ef4d1-6946-43d0-aa80-63cc0f180346-host\") on node \"crc\" DevicePath \"\"" Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.821551 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93ef4d1-6946-43d0-aa80-63cc0f180346-kube-api-access-c2kjp" (OuterVolumeSpecName: "kube-api-access-c2kjp") pod "a93ef4d1-6946-43d0-aa80-63cc0f180346" (UID: "a93ef4d1-6946-43d0-aa80-63cc0f180346"). InnerVolumeSpecName "kube-api-access-c2kjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:42:57 crc kubenswrapper[4574]: I1004 05:42:57.919113 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2kjp\" (UniqueName: \"kubernetes.io/projected/a93ef4d1-6946-43d0-aa80-63cc0f180346-kube-api-access-c2kjp\") on node \"crc\" DevicePath \"\"" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.640848 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c805b2225b0abab3611bc30bcd981f7af06602c5a436a659557430c1fac214d" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.640905 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-78pt2" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.742611 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a93ef4d1-6946-43d0-aa80-63cc0f180346" path="/var/lib/kubelet/pods/a93ef4d1-6946-43d0-aa80-63cc0f180346/volumes" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.948483 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-ssvfd"] Oct 04 05:42:58 crc kubenswrapper[4574]: E1004 05:42:58.948844 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="extract-content" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.948860 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="extract-content" Oct 04 05:42:58 crc kubenswrapper[4574]: E1004 05:42:58.948871 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="registry-server" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.948878 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="registry-server" Oct 04 05:42:58 crc kubenswrapper[4574]: E1004 05:42:58.948914 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="extract-utilities" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.948919 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="extract-utilities" Oct 04 05:42:58 crc kubenswrapper[4574]: E1004 05:42:58.948933 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93ef4d1-6946-43d0-aa80-63cc0f180346" containerName="container-00" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.948938 4574 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a93ef4d1-6946-43d0-aa80-63cc0f180346" containerName="container-00" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.949102 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0010a0c-010e-4f21-86cd-a5238bb1efd9" containerName="registry-server" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.949119 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="a93ef4d1-6946-43d0-aa80-63cc0f180346" containerName="container-00" Oct 04 05:42:58 crc kubenswrapper[4574]: I1004 05:42:58.950701 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.040030 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cqh\" (UniqueName: \"kubernetes.io/projected/e30ca915-ff10-4399-92f2-7bf260b81f4e-kube-api-access-68cqh\") pod \"crc-debug-ssvfd\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.040087 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e30ca915-ff10-4399-92f2-7bf260b81f4e-host\") pod \"crc-debug-ssvfd\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.141938 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68cqh\" (UniqueName: \"kubernetes.io/projected/e30ca915-ff10-4399-92f2-7bf260b81f4e-kube-api-access-68cqh\") pod \"crc-debug-ssvfd\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.142009 4574 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e30ca915-ff10-4399-92f2-7bf260b81f4e-host\") pod \"crc-debug-ssvfd\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.142180 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e30ca915-ff10-4399-92f2-7bf260b81f4e-host\") pod \"crc-debug-ssvfd\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.170535 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cqh\" (UniqueName: \"kubernetes.io/projected/e30ca915-ff10-4399-92f2-7bf260b81f4e-kube-api-access-68cqh\") pod \"crc-debug-ssvfd\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.312818 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.649972 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" event={"ID":"e30ca915-ff10-4399-92f2-7bf260b81f4e","Type":"ContainerStarted","Data":"fe4ac8ac5ea2a29176521a18b49dbfad780ce490295f6a171f0758362e15ade3"} Oct 04 05:42:59 crc kubenswrapper[4574]: I1004 05:42:59.650301 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" event={"ID":"e30ca915-ff10-4399-92f2-7bf260b81f4e","Type":"ContainerStarted","Data":"35f7e3e37d134658d8602dacdd4f1e502cd7ac9fcb22570fdb3981f44cb4d4f3"} Oct 04 05:43:00 crc kubenswrapper[4574]: I1004 05:43:00.659122 4574 generic.go:334] "Generic (PLEG): container finished" podID="e30ca915-ff10-4399-92f2-7bf260b81f4e" containerID="fe4ac8ac5ea2a29176521a18b49dbfad780ce490295f6a171f0758362e15ade3" exitCode=0 Oct 04 05:43:00 crc kubenswrapper[4574]: I1004 05:43:00.659342 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" event={"ID":"e30ca915-ff10-4399-92f2-7bf260b81f4e","Type":"ContainerDied","Data":"fe4ac8ac5ea2a29176521a18b49dbfad780ce490295f6a171f0758362e15ade3"} Oct 04 05:43:01 crc kubenswrapper[4574]: I1004 05:43:01.766810 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:43:01 crc kubenswrapper[4574]: I1004 05:43:01.894375 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e30ca915-ff10-4399-92f2-7bf260b81f4e-host\") pod \"e30ca915-ff10-4399-92f2-7bf260b81f4e\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " Oct 04 05:43:01 crc kubenswrapper[4574]: I1004 05:43:01.894482 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68cqh\" (UniqueName: \"kubernetes.io/projected/e30ca915-ff10-4399-92f2-7bf260b81f4e-kube-api-access-68cqh\") pod \"e30ca915-ff10-4399-92f2-7bf260b81f4e\" (UID: \"e30ca915-ff10-4399-92f2-7bf260b81f4e\") " Oct 04 05:43:01 crc kubenswrapper[4574]: I1004 05:43:01.895153 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e30ca915-ff10-4399-92f2-7bf260b81f4e-host" (OuterVolumeSpecName: "host") pod "e30ca915-ff10-4399-92f2-7bf260b81f4e" (UID: "e30ca915-ff10-4399-92f2-7bf260b81f4e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:43:01 crc kubenswrapper[4574]: I1004 05:43:01.904843 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30ca915-ff10-4399-92f2-7bf260b81f4e-kube-api-access-68cqh" (OuterVolumeSpecName: "kube-api-access-68cqh") pod "e30ca915-ff10-4399-92f2-7bf260b81f4e" (UID: "e30ca915-ff10-4399-92f2-7bf260b81f4e"). InnerVolumeSpecName "kube-api-access-68cqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:43:01 crc kubenswrapper[4574]: I1004 05:43:01.999555 4574 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e30ca915-ff10-4399-92f2-7bf260b81f4e-host\") on node \"crc\" DevicePath \"\"" Oct 04 05:43:02 crc kubenswrapper[4574]: I1004 05:43:02.000032 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68cqh\" (UniqueName: \"kubernetes.io/projected/e30ca915-ff10-4399-92f2-7bf260b81f4e-kube-api-access-68cqh\") on node \"crc\" DevicePath \"\"" Oct 04 05:43:02 crc kubenswrapper[4574]: I1004 05:43:02.675159 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" event={"ID":"e30ca915-ff10-4399-92f2-7bf260b81f4e","Type":"ContainerDied","Data":"35f7e3e37d134658d8602dacdd4f1e502cd7ac9fcb22570fdb3981f44cb4d4f3"} Oct 04 05:43:02 crc kubenswrapper[4574]: I1004 05:43:02.675198 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35f7e3e37d134658d8602dacdd4f1e502cd7ac9fcb22570fdb3981f44cb4d4f3" Oct 04 05:43:02 crc kubenswrapper[4574]: I1004 05:43:02.675220 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-ssvfd" Oct 04 05:43:06 crc kubenswrapper[4574]: I1004 05:43:06.954200 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-ssvfd"] Oct 04 05:43:06 crc kubenswrapper[4574]: I1004 05:43:06.961657 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-ssvfd"] Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.127173 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-nlnbb"] Oct 04 05:43:08 crc kubenswrapper[4574]: E1004 05:43:08.127610 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30ca915-ff10-4399-92f2-7bf260b81f4e" containerName="container-00" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.127626 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30ca915-ff10-4399-92f2-7bf260b81f4e" containerName="container-00" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.127824 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30ca915-ff10-4399-92f2-7bf260b81f4e" containerName="container-00" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.128443 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.208633 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15196bef-2933-4368-b2a6-6b57f91c4f86-host\") pod \"crc-debug-nlnbb\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.208881 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zm4j\" (UniqueName: \"kubernetes.io/projected/15196bef-2933-4368-b2a6-6b57f91c4f86-kube-api-access-4zm4j\") pod \"crc-debug-nlnbb\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.311214 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15196bef-2933-4368-b2a6-6b57f91c4f86-host\") pod \"crc-debug-nlnbb\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.311310 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15196bef-2933-4368-b2a6-6b57f91c4f86-host\") pod \"crc-debug-nlnbb\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.311415 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zm4j\" (UniqueName: \"kubernetes.io/projected/15196bef-2933-4368-b2a6-6b57f91c4f86-kube-api-access-4zm4j\") pod \"crc-debug-nlnbb\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc 
kubenswrapper[4574]: I1004 05:43:08.331458 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zm4j\" (UniqueName: \"kubernetes.io/projected/15196bef-2933-4368-b2a6-6b57f91c4f86-kube-api-access-4zm4j\") pod \"crc-debug-nlnbb\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.454178 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.725774 4574 generic.go:334] "Generic (PLEG): container finished" podID="15196bef-2933-4368-b2a6-6b57f91c4f86" containerID="9d2cdf9c5a78d6a5c0c5db79b265d3cae750c86f967ef5c72c8de09be746dc48" exitCode=0 Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.725854 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" event={"ID":"15196bef-2933-4368-b2a6-6b57f91c4f86","Type":"ContainerDied","Data":"9d2cdf9c5a78d6a5c0c5db79b265d3cae750c86f967ef5c72c8de09be746dc48"} Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.725893 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" event={"ID":"15196bef-2933-4368-b2a6-6b57f91c4f86","Type":"ContainerStarted","Data":"1de6e8cdb2faea249ed4c1aad943d87a22d74ff2778023e0efe3a010fda61a59"} Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.749731 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30ca915-ff10-4399-92f2-7bf260b81f4e" path="/var/lib/kubelet/pods/e30ca915-ff10-4399-92f2-7bf260b81f4e/volumes" Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.772399 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7hjf/crc-debug-nlnbb"] Oct 04 05:43:08 crc kubenswrapper[4574]: I1004 05:43:08.782151 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-v7hjf/crc-debug-nlnbb"] Oct 04 05:43:09 crc kubenswrapper[4574]: I1004 05:43:09.827853 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:09 crc kubenswrapper[4574]: I1004 05:43:09.940675 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15196bef-2933-4368-b2a6-6b57f91c4f86-host\") pod \"15196bef-2933-4368-b2a6-6b57f91c4f86\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " Oct 04 05:43:09 crc kubenswrapper[4574]: I1004 05:43:09.940751 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zm4j\" (UniqueName: \"kubernetes.io/projected/15196bef-2933-4368-b2a6-6b57f91c4f86-kube-api-access-4zm4j\") pod \"15196bef-2933-4368-b2a6-6b57f91c4f86\" (UID: \"15196bef-2933-4368-b2a6-6b57f91c4f86\") " Oct 04 05:43:09 crc kubenswrapper[4574]: I1004 05:43:09.940884 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15196bef-2933-4368-b2a6-6b57f91c4f86-host" (OuterVolumeSpecName: "host") pod "15196bef-2933-4368-b2a6-6b57f91c4f86" (UID: "15196bef-2933-4368-b2a6-6b57f91c4f86"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:43:09 crc kubenswrapper[4574]: I1004 05:43:09.941416 4574 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15196bef-2933-4368-b2a6-6b57f91c4f86-host\") on node \"crc\" DevicePath \"\"" Oct 04 05:43:09 crc kubenswrapper[4574]: I1004 05:43:09.946986 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15196bef-2933-4368-b2a6-6b57f91c4f86-kube-api-access-4zm4j" (OuterVolumeSpecName: "kube-api-access-4zm4j") pod "15196bef-2933-4368-b2a6-6b57f91c4f86" (UID: "15196bef-2933-4368-b2a6-6b57f91c4f86"). 
InnerVolumeSpecName "kube-api-access-4zm4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.043046 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zm4j\" (UniqueName: \"kubernetes.io/projected/15196bef-2933-4368-b2a6-6b57f91c4f86-kube-api-access-4zm4j\") on node \"crc\" DevicePath \"\"" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.554455 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-mgwq7_9c976366-a9b2-4720-a5ce-2aeffaf0dad2/kube-rbac-proxy/0.log" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.616103 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-mgwq7_9c976366-a9b2-4720-a5ce-2aeffaf0dad2/manager/0.log" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.743042 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/crc-debug-nlnbb" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.744359 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15196bef-2933-4368-b2a6-6b57f91c4f86" path="/var/lib/kubelet/pods/15196bef-2933-4368-b2a6-6b57f91c4f86/volumes" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.744952 4574 scope.go:117] "RemoveContainer" containerID="9d2cdf9c5a78d6a5c0c5db79b265d3cae750c86f967ef5c72c8de09be746dc48" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.806061 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-9t5xx_4552356b-ed71-465f-beb5-26c4a63dc81d/manager/0.log" Oct 04 05:43:10 crc kubenswrapper[4574]: I1004 05:43:10.829150 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-9t5xx_4552356b-ed71-465f-beb5-26c4a63dc81d/kube-rbac-proxy/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.049902 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/util/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.250590 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/pull/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.260904 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/util/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.300061 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/pull/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.465111 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/util/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.500832 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/pull/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.582760 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/extract/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.691265 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-qbzx8_39766d86-7ab2-42ca-b6ae-0e02eb871cc3/kube-rbac-proxy/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.770069 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-qbzx8_39766d86-7ab2-42ca-b6ae-0e02eb871cc3/manager/0.log" Oct 04 05:43:11 crc kubenswrapper[4574]: I1004 05:43:11.831056 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-pmvc8_52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd/kube-rbac-proxy/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.023427 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-pmvc8_52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd/manager/0.log" Oct 04 05:43:12 crc 
kubenswrapper[4574]: I1004 05:43:12.069460 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-mdh2j_d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7/kube-rbac-proxy/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.131161 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-mdh2j_d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7/manager/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.338376 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-96hsk_d552b4e4-9120-4d96-8615-fa6d68a71042/kube-rbac-proxy/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.403593 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-96hsk_d552b4e4-9120-4d96-8615-fa6d68a71042/manager/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.480443 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-gnpjd_e288039e-c6d3-4911-b284-1eb1cd2bccf2/kube-rbac-proxy/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.573859 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-xcjwv_d4f548d4-c2a0-4756-a55a-3d398b81d923/kube-rbac-proxy/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.705007 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-gnpjd_e288039e-c6d3-4911-b284-1eb1cd2bccf2/manager/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.716069 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-xcjwv_d4f548d4-c2a0-4756-a55a-3d398b81d923/manager/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.892527 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c777dc986-cvjnd_55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4/kube-rbac-proxy/0.log" Oct 04 05:43:12 crc kubenswrapper[4574]: I1004 05:43:12.989593 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c777dc986-cvjnd_55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4/manager/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.106826 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-88mfj_1edbf723-752f-416b-a922-12a73521d6f9/kube-rbac-proxy/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.188118 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j_85b1921d-1572-4aff-b002-2f31c2f270b4/kube-rbac-proxy/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.205180 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-88mfj_1edbf723-752f-416b-a922-12a73521d6f9/manager/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.316990 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j_85b1921d-1572-4aff-b002-2f31c2f270b4/manager/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.445877 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jt72t_90b04996-9e73-45c9-a03c-59e4bedf4ff4/kube-rbac-proxy/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 
05:43:13.547435 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jt72t_90b04996-9e73-45c9-a03c-59e4bedf4ff4/manager/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.611587 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-t222j_95f9af94-f839-464f-8c6f-8928146b0d26/kube-rbac-proxy/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.813044 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-g2kpz_54443166-57a5-4e11-914c-d9cb2f3252d7/kube-rbac-proxy/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.832569 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-t222j_95f9af94-f839-464f-8c6f-8928146b0d26/manager/0.log" Oct 04 05:43:13 crc kubenswrapper[4574]: I1004 05:43:13.913776 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-g2kpz_54443166-57a5-4e11-914c-d9cb2f3252d7/manager/0.log" Oct 04 05:43:14 crc kubenswrapper[4574]: I1004 05:43:14.013151 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cz7492_f0b7b141-c133-4487-9ecb-fab0b12d82bb/manager/0.log" Oct 04 05:43:14 crc kubenswrapper[4574]: I1004 05:43:14.087572 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cz7492_f0b7b141-c133-4487-9ecb-fab0b12d82bb/kube-rbac-proxy/0.log" Oct 04 05:43:14 crc kubenswrapper[4574]: I1004 05:43:14.202604 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8fff4c848-5cvwf_9169e6bf-53d3-420e-bb99-b9d897653612/kube-rbac-proxy/0.log" Oct 04 05:43:14 crc kubenswrapper[4574]: I1004 05:43:14.475877 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76d7b4df79-hsvhp_6c734153-0dff-4669-ae00-bd91be75e4c6/kube-rbac-proxy/0.log" Oct 04 05:43:14 crc kubenswrapper[4574]: I1004 05:43:14.723147 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nk4xx_116021ce-1084-4c34-b4b8-9499015e58c0/registry-server/0.log" Oct 04 05:43:14 crc kubenswrapper[4574]: I1004 05:43:14.737706 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76d7b4df79-hsvhp_6c734153-0dff-4669-ae00-bd91be75e4c6/operator/0.log" Oct 04 05:43:14 crc kubenswrapper[4574]: I1004 05:43:14.864003 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-sxfrz_46bd489f-f708-4c7e-b697-39e9fd65a30e/kube-rbac-proxy/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.029280 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-b4fbd_28570522-1dff-475f-8ab0-963f4ac14534/kube-rbac-proxy/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.048157 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-sxfrz_46bd489f-f708-4c7e-b697-39e9fd65a30e/manager/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.203415 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-b4fbd_28570522-1dff-475f-8ab0-963f4ac14534/manager/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.284184 4574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2_a95cec28-a993-4f56-b540-18ad84c5bd2d/operator/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.468101 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-2fzvp_e227d829-9a02-40dd-b0c5-012a7d024253/kube-rbac-proxy/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.613660 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-2fzvp_e227d829-9a02-40dd-b0c5-012a7d024253/manager/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.659006 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8fff4c848-5cvwf_9169e6bf-53d3-420e-bb99-b9d897653612/manager/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.666776 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-hfm8z_60dfec70-f10c-4d73-9933-f2cb76124090/kube-rbac-proxy/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.770422 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-hfm8z_60dfec70-f10c-4d73-9933-f2cb76124090/manager/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.909972 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mhxlg_f87750ff-5d28-4658-b7d4-bc49bcb35886/kube-rbac-proxy/0.log" Oct 04 05:43:15 crc kubenswrapper[4574]: I1004 05:43:15.910889 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mhxlg_f87750ff-5d28-4658-b7d4-bc49bcb35886/manager/0.log" Oct 04 05:43:15 crc 
kubenswrapper[4574]: I1004 05:43:15.983875 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-llj5f_cb68cf9f-4ba2-410a-85f7-1db627311ff6/kube-rbac-proxy/0.log" Oct 04 05:43:16 crc kubenswrapper[4574]: I1004 05:43:16.096941 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-llj5f_cb68cf9f-4ba2-410a-85f7-1db627311ff6/manager/0.log" Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.405163 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.405562 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.405619 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.406477 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3c969c5c34210d2443513e1094b552fde70ff1b5cf8839e3294ccaf892d01bd"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.406543 4574 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://d3c969c5c34210d2443513e1094b552fde70ff1b5cf8839e3294ccaf892d01bd" gracePeriod=600 Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.873314 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="d3c969c5c34210d2443513e1094b552fde70ff1b5cf8839e3294ccaf892d01bd" exitCode=0 Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.873528 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"d3c969c5c34210d2443513e1094b552fde70ff1b5cf8839e3294ccaf892d01bd"} Oct 04 05:43:19 crc kubenswrapper[4574]: I1004 05:43:19.873701 4574 scope.go:117] "RemoveContainer" containerID="f948cdd3b6855697bbe1af90667d04c7a186ef2190689302c3fb5ceb94f9e5e5" Oct 04 05:43:20 crc kubenswrapper[4574]: I1004 05:43:20.885164 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41"} Oct 04 05:43:32 crc kubenswrapper[4574]: I1004 05:43:32.243967 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x7jjx_d9424aaa-698a-43e0-ae1c-614cc4c538a6/control-plane-machine-set-operator/0.log" Oct 04 05:43:32 crc kubenswrapper[4574]: I1004 05:43:32.407513 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hkp92_da00c73e-dcd3-4fb7-aedd-77c84ea82855/kube-rbac-proxy/0.log" Oct 04 05:43:32 crc kubenswrapper[4574]: I1004 05:43:32.458974 4574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hkp92_da00c73e-dcd3-4fb7-aedd-77c84ea82855/machine-api-operator/0.log" Oct 04 05:43:44 crc kubenswrapper[4574]: I1004 05:43:44.414417 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qgjh7_58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e/cert-manager-controller/0.log" Oct 04 05:43:44 crc kubenswrapper[4574]: I1004 05:43:44.669330 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-f7cbs_25a7bfba-1bab-42d6-bb47-827aeeeefdbc/cert-manager-cainjector/0.log" Oct 04 05:43:44 crc kubenswrapper[4574]: I1004 05:43:44.709070 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mfxk5_cd556473-f56f-419c-b1b9-3a59dca5f00f/cert-manager-webhook/0.log" Oct 04 05:43:56 crc kubenswrapper[4574]: I1004 05:43:56.005781 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-vlq89_dc0445fe-9646-4248-a71b-c0dfff8b50f2/nmstate-console-plugin/0.log" Oct 04 05:43:56 crc kubenswrapper[4574]: I1004 05:43:56.182770 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-d95j7_88957498-0f2f-4fb7-baca-fc52a6abec78/nmstate-handler/0.log" Oct 04 05:43:56 crc kubenswrapper[4574]: I1004 05:43:56.276227 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6mt9v_7131c3ab-9443-4308-acef-460450511901/kube-rbac-proxy/0.log" Oct 04 05:43:56 crc kubenswrapper[4574]: I1004 05:43:56.291100 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6mt9v_7131c3ab-9443-4308-acef-460450511901/nmstate-metrics/0.log" Oct 04 05:43:56 crc kubenswrapper[4574]: I1004 05:43:56.420313 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-cxhlp_34ee31e2-d15b-4055-9e27-2ce2e9e43c28/nmstate-operator/0.log" Oct 04 05:43:56 crc kubenswrapper[4574]: I1004 05:43:56.548054 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-p9s5q_77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6/nmstate-webhook/0.log" Oct 04 05:44:09 crc kubenswrapper[4574]: I1004 05:44:09.811308 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-fl9dm_7de5a0bd-8082-40f2-9288-2c5417547a96/kube-rbac-proxy/0.log" Oct 04 05:44:09 crc kubenswrapper[4574]: I1004 05:44:09.910552 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-fl9dm_7de5a0bd-8082-40f2-9288-2c5417547a96/controller/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.059685 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.254776 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.257393 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.306163 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.316592 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.478063 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.491296 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.529815 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.539078 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.671849 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.712802 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.728664 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/controller/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.759611 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:44:10 crc kubenswrapper[4574]: I1004 05:44:10.910214 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/frr-metrics/0.log" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.017609 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/kube-rbac-proxy/0.log" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.059539 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/kube-rbac-proxy-frr/0.log" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.160849 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/reloader/0.log" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.375411 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-2lxv7_d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13/frr-k8s-webhook-server/0.log" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.731001 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7956f7d5bc-68jqm_cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3/manager/0.log" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.750673 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78dd4884c9-9rbjh_0b23a9bd-b984-4ec1-b18a-9617dad3a194/webhook-server/0.log" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.863338 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n69qm"] Oct 04 05:44:11 crc kubenswrapper[4574]: E1004 05:44:11.864464 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15196bef-2933-4368-b2a6-6b57f91c4f86" containerName="container-00" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.864560 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="15196bef-2933-4368-b2a6-6b57f91c4f86" containerName="container-00" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.864830 4574 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="15196bef-2933-4368-b2a6-6b57f91c4f86" containerName="container-00" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.881803 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:11 crc kubenswrapper[4574]: I1004 05:44:11.915931 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69qm"] Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.016842 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-catalog-content\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.016886 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bqcs\" (UniqueName: \"kubernetes.io/projected/046c9357-e4b7-45d2-bf0b-c14ba37e907b-kube-api-access-9bqcs\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.016974 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-utilities\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.029990 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/frr/0.log" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.118965 4574 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-catalog-content\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.119027 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bqcs\" (UniqueName: \"kubernetes.io/projected/046c9357-e4b7-45d2-bf0b-c14ba37e907b-kube-api-access-9bqcs\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.119153 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-utilities\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.119635 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-catalog-content\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.119644 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-utilities\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.151300 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9bqcs\" (UniqueName: \"kubernetes.io/projected/046c9357-e4b7-45d2-bf0b-c14ba37e907b-kube-api-access-9bqcs\") pod \"redhat-marketplace-n69qm\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.207210 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.570000 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pjq5j_d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb/kube-rbac-proxy/0.log" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.631635 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pjq5j_d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb/speaker/0.log" Oct 04 05:44:12 crc kubenswrapper[4574]: I1004 05:44:12.793026 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69qm"] Oct 04 05:44:13 crc kubenswrapper[4574]: I1004 05:44:13.384912 4574 generic.go:334] "Generic (PLEG): container finished" podID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerID="801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da" exitCode=0 Oct 04 05:44:13 crc kubenswrapper[4574]: I1004 05:44:13.384965 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69qm" event={"ID":"046c9357-e4b7-45d2-bf0b-c14ba37e907b","Type":"ContainerDied","Data":"801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da"} Oct 04 05:44:13 crc kubenswrapper[4574]: I1004 05:44:13.384997 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69qm" event={"ID":"046c9357-e4b7-45d2-bf0b-c14ba37e907b","Type":"ContainerStarted","Data":"fc86a28e3f4010b78e8d999b5321e350640a624ec89449f9f46471814887e569"} Oct 04 05:44:15 crc kubenswrapper[4574]: I1004 
05:44:15.405301 4574 generic.go:334] "Generic (PLEG): container finished" podID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerID="b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc" exitCode=0 Oct 04 05:44:15 crc kubenswrapper[4574]: I1004 05:44:15.405431 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69qm" event={"ID":"046c9357-e4b7-45d2-bf0b-c14ba37e907b","Type":"ContainerDied","Data":"b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc"} Oct 04 05:44:16 crc kubenswrapper[4574]: I1004 05:44:16.416255 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69qm" event={"ID":"046c9357-e4b7-45d2-bf0b-c14ba37e907b","Type":"ContainerStarted","Data":"ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce"} Oct 04 05:44:16 crc kubenswrapper[4574]: I1004 05:44:16.441875 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n69qm" podStartSLOduration=3.032861275 podStartE2EDuration="5.441855719s" podCreationTimestamp="2025-10-04 05:44:11 +0000 UTC" firstStartedPulling="2025-10-04 05:44:13.386743221 +0000 UTC m=+3479.240886253" lastFinishedPulling="2025-10-04 05:44:15.795737655 +0000 UTC m=+3481.649880697" observedRunningTime="2025-10-04 05:44:16.435482705 +0000 UTC m=+3482.289625747" watchObservedRunningTime="2025-10-04 05:44:16.441855719 +0000 UTC m=+3482.295998761" Oct 04 05:44:22 crc kubenswrapper[4574]: I1004 05:44:22.208191 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:22 crc kubenswrapper[4574]: I1004 05:44:22.208757 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:22 crc kubenswrapper[4574]: I1004 05:44:22.267896 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:22 crc kubenswrapper[4574]: I1004 05:44:22.517671 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:22 crc kubenswrapper[4574]: I1004 05:44:22.575259 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69qm"] Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.483548 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n69qm" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="registry-server" containerID="cri-o://ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce" gracePeriod=2 Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.912598 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.967120 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bqcs\" (UniqueName: \"kubernetes.io/projected/046c9357-e4b7-45d2-bf0b-c14ba37e907b-kube-api-access-9bqcs\") pod \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.968268 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-utilities\") pod \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.968350 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-catalog-content\") pod 
\"046c9357-e4b7-45d2-bf0b-c14ba37e907b\" (UID: \"046c9357-e4b7-45d2-bf0b-c14ba37e907b\") " Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.969133 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-utilities" (OuterVolumeSpecName: "utilities") pod "046c9357-e4b7-45d2-bf0b-c14ba37e907b" (UID: "046c9357-e4b7-45d2-bf0b-c14ba37e907b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.981795 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046c9357-e4b7-45d2-bf0b-c14ba37e907b-kube-api-access-9bqcs" (OuterVolumeSpecName: "kube-api-access-9bqcs") pod "046c9357-e4b7-45d2-bf0b-c14ba37e907b" (UID: "046c9357-e4b7-45d2-bf0b-c14ba37e907b"). InnerVolumeSpecName "kube-api-access-9bqcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:44:24 crc kubenswrapper[4574]: I1004 05:44:24.999345 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "046c9357-e4b7-45d2-bf0b-c14ba37e907b" (UID: "046c9357-e4b7-45d2-bf0b-c14ba37e907b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.070850 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bqcs\" (UniqueName: \"kubernetes.io/projected/046c9357-e4b7-45d2-bf0b-c14ba37e907b-kube-api-access-9bqcs\") on node \"crc\" DevicePath \"\"" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.071105 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.071212 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046c9357-e4b7-45d2-bf0b-c14ba37e907b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.492907 4574 generic.go:334] "Generic (PLEG): container finished" podID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerID="ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce" exitCode=0 Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.492967 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69qm" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.492983 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69qm" event={"ID":"046c9357-e4b7-45d2-bf0b-c14ba37e907b","Type":"ContainerDied","Data":"ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce"} Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.493282 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69qm" event={"ID":"046c9357-e4b7-45d2-bf0b-c14ba37e907b","Type":"ContainerDied","Data":"fc86a28e3f4010b78e8d999b5321e350640a624ec89449f9f46471814887e569"} Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.493304 4574 scope.go:117] "RemoveContainer" containerID="ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.518167 4574 scope.go:117] "RemoveContainer" containerID="b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.535724 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69qm"] Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.552243 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69qm"] Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.562070 4574 scope.go:117] "RemoveContainer" containerID="801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.598574 4574 scope.go:117] "RemoveContainer" containerID="ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce" Oct 04 05:44:25 crc kubenswrapper[4574]: E1004 05:44:25.600617 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce\": container with ID starting with ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce not found: ID does not exist" containerID="ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.600731 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce"} err="failed to get container status \"ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce\": rpc error: code = NotFound desc = could not find container \"ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce\": container with ID starting with ff1cafbaca1e7a72342dff3e65162d46421befb4240177c4c3d218bc2d5bfdce not found: ID does not exist" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.600777 4574 scope.go:117] "RemoveContainer" containerID="b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc" Oct 04 05:44:25 crc kubenswrapper[4574]: E1004 05:44:25.607982 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc\": container with ID starting with b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc not found: ID does not exist" containerID="b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.608036 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc"} err="failed to get container status \"b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc\": rpc error: code = NotFound desc = could not find container \"b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc\": container with ID 
starting with b156fa97d037d1a908e73f41934c5085ff03f7fe4e6ba563edf0dd804309cddc not found: ID does not exist" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.608074 4574 scope.go:117] "RemoveContainer" containerID="801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da" Oct 04 05:44:25 crc kubenswrapper[4574]: E1004 05:44:25.611405 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da\": container with ID starting with 801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da not found: ID does not exist" containerID="801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da" Oct 04 05:44:25 crc kubenswrapper[4574]: I1004 05:44:25.611458 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da"} err="failed to get container status \"801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da\": rpc error: code = NotFound desc = could not find container \"801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da\": container with ID starting with 801b7dec06a2cb80fd8373d191541a43b442035d5fdf85f68e37706b601cc7da not found: ID does not exist" Oct 04 05:44:26 crc kubenswrapper[4574]: I1004 05:44:26.746987 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" path="/var/lib/kubelet/pods/046c9357-e4b7-45d2-bf0b-c14ba37e907b/volumes" Oct 04 05:44:26 crc kubenswrapper[4574]: I1004 05:44:26.864920 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/util/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.070445 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/util/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.118868 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/pull/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.129146 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/pull/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.349150 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/pull/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.407433 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/util/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.452379 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/extract/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.613569 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-utilities/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.754108 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-utilities/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 
05:44:27.766858 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-content/0.log" Oct 04 05:44:27 crc kubenswrapper[4574]: I1004 05:44:27.843000 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-content/0.log" Oct 04 05:44:28 crc kubenswrapper[4574]: I1004 05:44:28.068127 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-content/0.log" Oct 04 05:44:28 crc kubenswrapper[4574]: I1004 05:44:28.171519 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-utilities/0.log" Oct 04 05:44:28 crc kubenswrapper[4574]: I1004 05:44:28.438968 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-utilities/0.log" Oct 04 05:44:28 crc kubenswrapper[4574]: I1004 05:44:28.626447 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/registry-server/0.log" Oct 04 05:44:28 crc kubenswrapper[4574]: I1004 05:44:28.693645 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-utilities/0.log" Oct 04 05:44:28 crc kubenswrapper[4574]: I1004 05:44:28.733678 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-content/0.log" Oct 04 05:44:28 crc kubenswrapper[4574]: I1004 05:44:28.786386 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-content/0.log" Oct 04 05:44:29 crc kubenswrapper[4574]: I1004 05:44:29.122269 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-utilities/0.log" Oct 04 05:44:29 crc kubenswrapper[4574]: I1004 05:44:29.135847 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-content/0.log" Oct 04 05:44:29 crc kubenswrapper[4574]: I1004 05:44:29.556691 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/registry-server/0.log" Oct 04 05:44:29 crc kubenswrapper[4574]: I1004 05:44:29.600435 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/util/0.log" Oct 04 05:44:29 crc kubenswrapper[4574]: I1004 05:44:29.809302 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/util/0.log" Oct 04 05:44:29 crc kubenswrapper[4574]: I1004 05:44:29.812607 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/pull/0.log" Oct 04 05:44:29 crc kubenswrapper[4574]: I1004 05:44:29.875422 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/pull/0.log" Oct 04 05:44:30 crc kubenswrapper[4574]: I1004 05:44:30.729779 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/extract/0.log" Oct 04 05:44:30 crc kubenswrapper[4574]: I1004 05:44:30.751122 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/pull/0.log" Oct 04 05:44:30 crc kubenswrapper[4574]: I1004 05:44:30.761749 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/util/0.log" Oct 04 05:44:30 crc kubenswrapper[4574]: I1004 05:44:30.965069 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-28wcm_40f47671-d6bd-402e-8003-3688245aa0ed/marketplace-operator/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.082078 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-utilities/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.249417 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-content/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.282084 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-utilities/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.296703 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-content/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.502093 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-content/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.612339 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-utilities/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.736420 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/registry-server/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.766552 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-utilities/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.912211 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-utilities/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.973635 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-content/0.log" Oct 04 05:44:31 crc kubenswrapper[4574]: I1004 05:44:31.973808 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-content/0.log" Oct 04 05:44:32 crc kubenswrapper[4574]: I1004 05:44:32.162268 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-utilities/0.log" Oct 04 05:44:32 crc kubenswrapper[4574]: I1004 05:44:32.206495 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-content/0.log" Oct 
04 05:44:32 crc kubenswrapper[4574]: I1004 05:44:32.664608 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/registry-server/0.log" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.176910 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74"] Oct 04 05:45:00 crc kubenswrapper[4574]: E1004 05:45:00.177985 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="registry-server" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.178002 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="registry-server" Oct 04 05:45:00 crc kubenswrapper[4574]: E1004 05:45:00.178021 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="extract-content" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.178031 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="extract-content" Oct 04 05:45:00 crc kubenswrapper[4574]: E1004 05:45:00.178062 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="extract-utilities" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.178071 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="extract-utilities" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.178403 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="046c9357-e4b7-45d2-bf0b-c14ba37e907b" containerName="registry-server" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.179176 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.182113 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.182966 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.199171 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74"] Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.362862 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5280493c-bce2-4994-97ea-f7e55593e4f5-config-volume\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.363683 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jmh\" (UniqueName: \"kubernetes.io/projected/5280493c-bce2-4994-97ea-f7e55593e4f5-kube-api-access-42jmh\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.363726 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5280493c-bce2-4994-97ea-f7e55593e4f5-secret-volume\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.465691 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42jmh\" (UniqueName: \"kubernetes.io/projected/5280493c-bce2-4994-97ea-f7e55593e4f5-kube-api-access-42jmh\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.465757 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5280493c-bce2-4994-97ea-f7e55593e4f5-secret-volume\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.465895 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5280493c-bce2-4994-97ea-f7e55593e4f5-config-volume\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.466861 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5280493c-bce2-4994-97ea-f7e55593e4f5-config-volume\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.488185 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5280493c-bce2-4994-97ea-f7e55593e4f5-secret-volume\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.493965 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jmh\" (UniqueName: \"kubernetes.io/projected/5280493c-bce2-4994-97ea-f7e55593e4f5-kube-api-access-42jmh\") pod \"collect-profiles-29325945-vqq74\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:00 crc kubenswrapper[4574]: I1004 05:45:00.515891 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:01 crc kubenswrapper[4574]: I1004 05:45:01.058059 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74"] Oct 04 05:45:01 crc kubenswrapper[4574]: W1004 05:45:01.074520 4574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5280493c_bce2_4994_97ea_f7e55593e4f5.slice/crio-70b1b49308a0a53317ebce1a244748a154b562c0bf95398bb3b5131a1ad3ac95 WatchSource:0}: Error finding container 70b1b49308a0a53317ebce1a244748a154b562c0bf95398bb3b5131a1ad3ac95: Status 404 returned error can't find the container with id 70b1b49308a0a53317ebce1a244748a154b562c0bf95398bb3b5131a1ad3ac95 Oct 04 05:45:01 crc kubenswrapper[4574]: I1004 05:45:01.833993 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" event={"ID":"5280493c-bce2-4994-97ea-f7e55593e4f5","Type":"ContainerStarted","Data":"33cffd9df973d6c76e343e0ee5a2fe5a6e861dca149fbe52d07d6cf888b92708"} Oct 04 05:45:01 crc 
kubenswrapper[4574]: I1004 05:45:01.834359 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" event={"ID":"5280493c-bce2-4994-97ea-f7e55593e4f5","Type":"ContainerStarted","Data":"70b1b49308a0a53317ebce1a244748a154b562c0bf95398bb3b5131a1ad3ac95"} Oct 04 05:45:01 crc kubenswrapper[4574]: I1004 05:45:01.858970 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" podStartSLOduration=1.858947184 podStartE2EDuration="1.858947184s" podCreationTimestamp="2025-10-04 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:45:01.854831405 +0000 UTC m=+3527.708974457" watchObservedRunningTime="2025-10-04 05:45:01.858947184 +0000 UTC m=+3527.713090226" Oct 04 05:45:02 crc kubenswrapper[4574]: I1004 05:45:02.844988 4574 generic.go:334] "Generic (PLEG): container finished" podID="5280493c-bce2-4994-97ea-f7e55593e4f5" containerID="33cffd9df973d6c76e343e0ee5a2fe5a6e861dca149fbe52d07d6cf888b92708" exitCode=0 Oct 04 05:45:02 crc kubenswrapper[4574]: I1004 05:45:02.845164 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" event={"ID":"5280493c-bce2-4994-97ea-f7e55593e4f5","Type":"ContainerDied","Data":"33cffd9df973d6c76e343e0ee5a2fe5a6e861dca149fbe52d07d6cf888b92708"} Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.194564 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.348199 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5280493c-bce2-4994-97ea-f7e55593e4f5-secret-volume\") pod \"5280493c-bce2-4994-97ea-f7e55593e4f5\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.348377 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42jmh\" (UniqueName: \"kubernetes.io/projected/5280493c-bce2-4994-97ea-f7e55593e4f5-kube-api-access-42jmh\") pod \"5280493c-bce2-4994-97ea-f7e55593e4f5\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.348413 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5280493c-bce2-4994-97ea-f7e55593e4f5-config-volume\") pod \"5280493c-bce2-4994-97ea-f7e55593e4f5\" (UID: \"5280493c-bce2-4994-97ea-f7e55593e4f5\") " Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.349310 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5280493c-bce2-4994-97ea-f7e55593e4f5-config-volume" (OuterVolumeSpecName: "config-volume") pod "5280493c-bce2-4994-97ea-f7e55593e4f5" (UID: "5280493c-bce2-4994-97ea-f7e55593e4f5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.354634 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5280493c-bce2-4994-97ea-f7e55593e4f5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5280493c-bce2-4994-97ea-f7e55593e4f5" (UID: "5280493c-bce2-4994-97ea-f7e55593e4f5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.370577 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5280493c-bce2-4994-97ea-f7e55593e4f5-kube-api-access-42jmh" (OuterVolumeSpecName: "kube-api-access-42jmh") pod "5280493c-bce2-4994-97ea-f7e55593e4f5" (UID: "5280493c-bce2-4994-97ea-f7e55593e4f5"). InnerVolumeSpecName "kube-api-access-42jmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.451709 4574 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5280493c-bce2-4994-97ea-f7e55593e4f5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.451960 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42jmh\" (UniqueName: \"kubernetes.io/projected/5280493c-bce2-4994-97ea-f7e55593e4f5-kube-api-access-42jmh\") on node \"crc\" DevicePath \"\"" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.452074 4574 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5280493c-bce2-4994-97ea-f7e55593e4f5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.863852 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" event={"ID":"5280493c-bce2-4994-97ea-f7e55593e4f5","Type":"ContainerDied","Data":"70b1b49308a0a53317ebce1a244748a154b562c0bf95398bb3b5131a1ad3ac95"} Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.864129 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70b1b49308a0a53317ebce1a244748a154b562c0bf95398bb3b5131a1ad3ac95" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.863901 4574 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325945-vqq74" Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.944390 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"] Oct 04 05:45:04 crc kubenswrapper[4574]: I1004 05:45:04.951906 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325900-vtgft"] Oct 04 05:45:06 crc kubenswrapper[4574]: I1004 05:45:06.756630 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8306ee34-f88b-417d-b2e1-efa57667fdfd" path="/var/lib/kubelet/pods/8306ee34-f88b-417d-b2e1-efa57667fdfd/volumes" Oct 04 05:45:19 crc kubenswrapper[4574]: I1004 05:45:19.404525 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:45:19 crc kubenswrapper[4574]: I1004 05:45:19.404932 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:45:25 crc kubenswrapper[4574]: I1004 05:45:25.147963 4574 scope.go:117] "RemoveContainer" containerID="65dc39243397b106ef5cbfffa430b62a1dd2932c69495bdd24f846963625d012" Oct 04 05:45:49 crc kubenswrapper[4574]: I1004 05:45:49.404567 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 04 05:45:49 crc kubenswrapper[4574]: I1004 05:45:49.405190 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:46:19 crc kubenswrapper[4574]: I1004 05:46:19.404704 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:46:19 crc kubenswrapper[4574]: I1004 05:46:19.405718 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:46:19 crc kubenswrapper[4574]: I1004 05:46:19.405809 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" Oct 04 05:46:19 crc kubenswrapper[4574]: I1004 05:46:19.407744 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 04 05:46:19 crc kubenswrapper[4574]: I1004 05:46:19.407864 4574 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" gracePeriod=600 Oct 04 05:46:20 crc kubenswrapper[4574]: E1004 05:46:20.074938 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:46:20 crc kubenswrapper[4574]: I1004 05:46:20.573490 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" exitCode=0 Oct 04 05:46:20 crc kubenswrapper[4574]: I1004 05:46:20.573588 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41"} Oct 04 05:46:20 crc kubenswrapper[4574]: I1004 05:46:20.574442 4574 scope.go:117] "RemoveContainer" containerID="d3c969c5c34210d2443513e1094b552fde70ff1b5cf8839e3294ccaf892d01bd" Oct 04 05:46:20 crc kubenswrapper[4574]: I1004 05:46:20.583733 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:46:20 crc kubenswrapper[4574]: E1004 05:46:20.584008 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:46:32 crc kubenswrapper[4574]: I1004 05:46:32.733160 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:46:32 crc kubenswrapper[4574]: E1004 05:46:32.734131 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:46:43 crc kubenswrapper[4574]: I1004 05:46:43.733577 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:46:43 crc kubenswrapper[4574]: E1004 05:46:43.734362 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:46:44 crc kubenswrapper[4574]: I1004 05:46:44.776069 4574 generic.go:334] "Generic (PLEG): container finished" podID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerID="49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4" exitCode=0 Oct 04 05:46:44 crc kubenswrapper[4574]: I1004 05:46:44.776160 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v7hjf/must-gather-25mpc" 
event={"ID":"ccbf02b4-3afc-447e-a4e1-6b784ebff333","Type":"ContainerDied","Data":"49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4"} Oct 04 05:46:44 crc kubenswrapper[4574]: I1004 05:46:44.778459 4574 scope.go:117] "RemoveContainer" containerID="49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4" Oct 04 05:46:45 crc kubenswrapper[4574]: I1004 05:46:45.294476 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v7hjf_must-gather-25mpc_ccbf02b4-3afc-447e-a4e1-6b784ebff333/gather/0.log" Oct 04 05:46:53 crc kubenswrapper[4574]: I1004 05:46:53.984601 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v7hjf/must-gather-25mpc"] Oct 04 05:46:53 crc kubenswrapper[4574]: I1004 05:46:53.985426 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v7hjf/must-gather-25mpc" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerName="copy" containerID="cri-o://fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d" gracePeriod=2 Oct 04 05:46:53 crc kubenswrapper[4574]: I1004 05:46:53.993163 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v7hjf/must-gather-25mpc"] Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.491357 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v7hjf_must-gather-25mpc_ccbf02b4-3afc-447e-a4e1-6b784ebff333/copy/0.log" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.493265 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.604219 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccbf02b4-3afc-447e-a4e1-6b784ebff333-must-gather-output\") pod \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.604482 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqgqq\" (UniqueName: \"kubernetes.io/projected/ccbf02b4-3afc-447e-a4e1-6b784ebff333-kube-api-access-wqgqq\") pod \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\" (UID: \"ccbf02b4-3afc-447e-a4e1-6b784ebff333\") " Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.610035 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbf02b4-3afc-447e-a4e1-6b784ebff333-kube-api-access-wqgqq" (OuterVolumeSpecName: "kube-api-access-wqgqq") pod "ccbf02b4-3afc-447e-a4e1-6b784ebff333" (UID: "ccbf02b4-3afc-447e-a4e1-6b784ebff333"). InnerVolumeSpecName "kube-api-access-wqgqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.707115 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqgqq\" (UniqueName: \"kubernetes.io/projected/ccbf02b4-3afc-447e-a4e1-6b784ebff333-kube-api-access-wqgqq\") on node \"crc\" DevicePath \"\"" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.772604 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccbf02b4-3afc-447e-a4e1-6b784ebff333-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ccbf02b4-3afc-447e-a4e1-6b784ebff333" (UID: "ccbf02b4-3afc-447e-a4e1-6b784ebff333"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.808972 4574 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccbf02b4-3afc-447e-a4e1-6b784ebff333-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.886997 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v7hjf_must-gather-25mpc_ccbf02b4-3afc-447e-a4e1-6b784ebff333/copy/0.log" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.887400 4574 generic.go:334] "Generic (PLEG): container finished" podID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerID="fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d" exitCode=143 Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.887485 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v7hjf/must-gather-25mpc" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.887480 4574 scope.go:117] "RemoveContainer" containerID="fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.909845 4574 scope.go:117] "RemoveContainer" containerID="49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.968521 4574 scope.go:117] "RemoveContainer" containerID="fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d" Oct 04 05:46:54 crc kubenswrapper[4574]: E1004 05:46:54.969095 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d\": container with ID starting with fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d not found: ID does not exist" 
containerID="fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.969139 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d"} err="failed to get container status \"fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d\": rpc error: code = NotFound desc = could not find container \"fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d\": container with ID starting with fddd720ff3fc8f8202d016c22def6e3ec0cf887c97384fef613bd9f7d559534d not found: ID does not exist" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.969167 4574 scope.go:117] "RemoveContainer" containerID="49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4" Oct 04 05:46:54 crc kubenswrapper[4574]: E1004 05:46:54.969523 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4\": container with ID starting with 49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4 not found: ID does not exist" containerID="49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4" Oct 04 05:46:54 crc kubenswrapper[4574]: I1004 05:46:54.969567 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4"} err="failed to get container status \"49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4\": rpc error: code = NotFound desc = could not find container \"49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4\": container with ID starting with 49cb249e8649018c731348ebbda45e8b82817b692b03d3054b81cb2d8c5ed2b4 not found: ID does not exist" Oct 04 05:46:56 crc kubenswrapper[4574]: I1004 05:46:56.742850 4574 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" path="/var/lib/kubelet/pods/ccbf02b4-3afc-447e-a4e1-6b784ebff333/volumes" Oct 04 05:46:57 crc kubenswrapper[4574]: I1004 05:46:57.733500 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:46:57 crc kubenswrapper[4574]: E1004 05:46:57.734002 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:47:10 crc kubenswrapper[4574]: I1004 05:47:10.733076 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:47:10 crc kubenswrapper[4574]: E1004 05:47:10.734605 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.739803 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmxsr/must-gather-6kx54"] Oct 04 05:47:19 crc kubenswrapper[4574]: E1004 05:47:19.740814 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerName="copy" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.740828 4574 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerName="copy" Oct 04 05:47:19 crc kubenswrapper[4574]: E1004 05:47:19.740847 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerName="gather" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.740854 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerName="gather" Oct 04 05:47:19 crc kubenswrapper[4574]: E1004 05:47:19.740885 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5280493c-bce2-4994-97ea-f7e55593e4f5" containerName="collect-profiles" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.740895 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5280493c-bce2-4994-97ea-f7e55593e4f5" containerName="collect-profiles" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.741104 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerName="gather" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.741136 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="5280493c-bce2-4994-97ea-f7e55593e4f5" containerName="collect-profiles" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.741147 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbf02b4-3afc-447e-a4e1-6b784ebff333" containerName="copy" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.742363 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.748250 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xmxsr"/"openshift-service-ca.crt" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.748330 4574 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xmxsr"/"kube-root-ca.crt" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.755789 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xmxsr/must-gather-6kx54"] Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.776332 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-must-gather-output\") pod \"must-gather-6kx54\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.776417 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj4kl\" (UniqueName: \"kubernetes.io/projected/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-kube-api-access-gj4kl\") pod \"must-gather-6kx54\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.878063 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-must-gather-output\") pod \"must-gather-6kx54\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.878125 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gj4kl\" (UniqueName: \"kubernetes.io/projected/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-kube-api-access-gj4kl\") pod \"must-gather-6kx54\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.878615 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-must-gather-output\") pod \"must-gather-6kx54\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:19 crc kubenswrapper[4574]: I1004 05:47:19.897935 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj4kl\" (UniqueName: \"kubernetes.io/projected/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-kube-api-access-gj4kl\") pod \"must-gather-6kx54\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:20 crc kubenswrapper[4574]: I1004 05:47:20.062550 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:47:20 crc kubenswrapper[4574]: I1004 05:47:20.591715 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xmxsr/must-gather-6kx54"] Oct 04 05:47:21 crc kubenswrapper[4574]: I1004 05:47:21.111573 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/must-gather-6kx54" event={"ID":"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65","Type":"ContainerStarted","Data":"fb4f67df1549d103da45f2f05b466be2878af655b2fe5fc6ab7dd05bed02e051"} Oct 04 05:47:21 crc kubenswrapper[4574]: I1004 05:47:21.112075 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/must-gather-6kx54" event={"ID":"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65","Type":"ContainerStarted","Data":"44dcea32fa4920978ef09aecf8b254ff8569a996b93898c601b0e84ddedf1b84"} Oct 04 05:47:21 crc kubenswrapper[4574]: I1004 05:47:21.740175 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:47:21 crc kubenswrapper[4574]: E1004 05:47:21.740457 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:47:22 crc kubenswrapper[4574]: I1004 05:47:22.120515 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/must-gather-6kx54" event={"ID":"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65","Type":"ContainerStarted","Data":"3ac7d10566717c18113c4e4215038853490f021e92ce89082b4652f5ddbfbd21"} Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.725001 4574 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-xmxsr/must-gather-6kx54" podStartSLOduration=5.724920952 podStartE2EDuration="5.724920952s" podCreationTimestamp="2025-10-04 05:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:47:22.14056452 +0000 UTC m=+3667.994707562" watchObservedRunningTime="2025-10-04 05:47:24.724920952 +0000 UTC m=+3670.579063994" Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.748700 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-thlps"] Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.749914 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.752898 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xmxsr"/"default-dockercfg-mgss7" Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.870922 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/379c7c79-8b4c-4b27-aba0-418f129cbaa4-host\") pod \"crc-debug-thlps\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.870976 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh6bm\" (UniqueName: \"kubernetes.io/projected/379c7c79-8b4c-4b27-aba0-418f129cbaa4-kube-api-access-dh6bm\") pod \"crc-debug-thlps\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.972954 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/379c7c79-8b4c-4b27-aba0-418f129cbaa4-host\") pod \"crc-debug-thlps\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.973022 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh6bm\" (UniqueName: \"kubernetes.io/projected/379c7c79-8b4c-4b27-aba0-418f129cbaa4-kube-api-access-dh6bm\") pod \"crc-debug-thlps\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:24 crc kubenswrapper[4574]: I1004 05:47:24.973094 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/379c7c79-8b4c-4b27-aba0-418f129cbaa4-host\") pod \"crc-debug-thlps\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:25 crc kubenswrapper[4574]: I1004 05:47:25.011019 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh6bm\" (UniqueName: \"kubernetes.io/projected/379c7c79-8b4c-4b27-aba0-418f129cbaa4-kube-api-access-dh6bm\") pod \"crc-debug-thlps\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:25 crc kubenswrapper[4574]: I1004 05:47:25.071651 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:47:25 crc kubenswrapper[4574]: I1004 05:47:25.237486 4574 scope.go:117] "RemoveContainer" containerID="c9e099becdd9aa247bc8ef30d0f1268a24165b125aea1a4811b9cbb815db82dd" Oct 04 05:47:26 crc kubenswrapper[4574]: I1004 05:47:26.165338 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-thlps" event={"ID":"379c7c79-8b4c-4b27-aba0-418f129cbaa4","Type":"ContainerStarted","Data":"0bd08e80c06896a729fb91f910c9174a8ddea1243dc924a34ec2605a3a873329"} Oct 04 05:47:26 crc kubenswrapper[4574]: I1004 05:47:26.165892 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-thlps" event={"ID":"379c7c79-8b4c-4b27-aba0-418f129cbaa4","Type":"ContainerStarted","Data":"eac791112b79ecb120e4caa4d874ae05792606cff8c6b049b86bf728f42a73f8"} Oct 04 05:47:26 crc kubenswrapper[4574]: I1004 05:47:26.217329 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xmxsr/crc-debug-thlps" podStartSLOduration=2.217299222 podStartE2EDuration="2.217299222s" podCreationTimestamp="2025-10-04 05:47:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:47:26.202812474 +0000 UTC m=+3672.056955516" watchObservedRunningTime="2025-10-04 05:47:26.217299222 +0000 UTC m=+3672.071442264" Oct 04 05:47:35 crc kubenswrapper[4574]: I1004 05:47:35.734181 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:47:35 crc kubenswrapper[4574]: E1004 05:47:35.734988 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:47:49 crc kubenswrapper[4574]: I1004 05:47:49.733276 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:47:49 crc kubenswrapper[4574]: E1004 05:47:49.734166 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:48:04 crc kubenswrapper[4574]: I1004 05:48:04.742414 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:48:04 crc kubenswrapper[4574]: E1004 05:48:04.746894 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:48:18 crc kubenswrapper[4574]: I1004 05:48:18.733541 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:48:18 crc kubenswrapper[4574]: E1004 05:48:18.734341 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:48:29 crc kubenswrapper[4574]: I1004 05:48:29.733526 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:48:29 crc kubenswrapper[4574]: E1004 05:48:29.734366 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:48:43 crc kubenswrapper[4574]: I1004 05:48:43.162602 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-766d778598-9bz6b_c224adb6-7a04-4bd4-bc6a-d8c484c8710e/barbican-api/0.log" Oct 04 05:48:43 crc kubenswrapper[4574]: I1004 05:48:43.311221 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-766d778598-9bz6b_c224adb6-7a04-4bd4-bc6a-d8c484c8710e/barbican-api-log/0.log" Oct 04 05:48:43 crc kubenswrapper[4574]: I1004 05:48:43.382026 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5475848bb4-qk59c_5438cd90-23bc-4da2-8856-519b7656f8ff/barbican-keystone-listener/0.log" Oct 04 05:48:43 crc kubenswrapper[4574]: I1004 05:48:43.719144 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5475848bb4-qk59c_5438cd90-23bc-4da2-8856-519b7656f8ff/barbican-keystone-listener-log/0.log" Oct 04 05:48:43 crc kubenswrapper[4574]: I1004 05:48:43.728630 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-86595bb85-v84cq_857fe45e-27ff-44ef-b58c-9e1278946927/barbican-worker/0.log" Oct 04 05:48:43 crc kubenswrapper[4574]: I1004 05:48:43.733017 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:48:43 crc kubenswrapper[4574]: E1004 05:48:43.733261 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:48:43 crc kubenswrapper[4574]: I1004 05:48:43.927122 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86595bb85-v84cq_857fe45e-27ff-44ef-b58c-9e1278946927/barbican-worker-log/0.log" Oct 04 05:48:44 crc kubenswrapper[4574]: I1004 05:48:44.020503 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-trfc2_1e9631eb-d051-4087-81eb-2f33ea4dd993/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:44 crc kubenswrapper[4574]: I1004 05:48:44.286223 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/ceilometer-central-agent/0.log" Oct 04 05:48:44 crc kubenswrapper[4574]: I1004 05:48:44.393343 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/ceilometer-notification-agent/0.log" Oct 04 05:48:44 crc kubenswrapper[4574]: I1004 05:48:44.503618 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/sg-core/0.log" Oct 04 05:48:44 crc kubenswrapper[4574]: I1004 
05:48:44.605137 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1abcd2f9-3753-4b7e-a5a3-0784ec9518f1/proxy-httpd/0.log" Oct 04 05:48:44 crc kubenswrapper[4574]: I1004 05:48:44.817147 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_38984f83-1657-45b8-bcd4-448c2306ea86/cinder-api/0.log" Oct 04 05:48:44 crc kubenswrapper[4574]: I1004 05:48:44.849512 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_38984f83-1657-45b8-bcd4-448c2306ea86/cinder-api-log/0.log" Oct 04 05:48:45 crc kubenswrapper[4574]: I1004 05:48:45.051627 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_161f98e1-5520-4148-8565-05394e7e8daf/cinder-scheduler/0.log" Oct 04 05:48:45 crc kubenswrapper[4574]: I1004 05:48:45.337365 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_161f98e1-5520-4148-8565-05394e7e8daf/probe/0.log" Oct 04 05:48:45 crc kubenswrapper[4574]: I1004 05:48:45.502794 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vj82x_75cb1602-ada9-4442-be91-3fa85a464d5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:45 crc kubenswrapper[4574]: I1004 05:48:45.712293 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-65256_da9c2287-2920-4152-bf57-7eb8effbea81/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:45 crc kubenswrapper[4574]: I1004 05:48:45.887937 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-grdlr_9fc7c75a-28ce-4bc4-9d47-e637a5a0f1ad/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:46 crc kubenswrapper[4574]: I1004 05:48:46.105414 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44rq7_3f536491-9237-4de1-b43d-2ffefcf26eb8/init/0.log" Oct 04 05:48:46 crc kubenswrapper[4574]: I1004 05:48:46.308032 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44rq7_3f536491-9237-4de1-b43d-2ffefcf26eb8/init/0.log" Oct 04 05:48:46 crc kubenswrapper[4574]: I1004 05:48:46.579195 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7fd9f947b7-44rq7_3f536491-9237-4de1-b43d-2ffefcf26eb8/dnsmasq-dns/0.log" Oct 04 05:48:46 crc kubenswrapper[4574]: I1004 05:48:46.721420 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-qwwfs_b8508613-3769-4000-9037-bce43bf206bb/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:46 crc kubenswrapper[4574]: I1004 05:48:46.970695 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5f5514e6-eceb-4683-9633-684cc13d5458/glance-httpd/0.log" Oct 04 05:48:47 crc kubenswrapper[4574]: I1004 05:48:47.015737 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5f5514e6-eceb-4683-9633-684cc13d5458/glance-log/0.log" Oct 04 05:48:47 crc kubenswrapper[4574]: I1004 05:48:47.293624 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7/glance-log/0.log" Oct 04 05:48:47 crc kubenswrapper[4574]: I1004 05:48:47.323516 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_18a15f43-fb88-4fd3-8215-4e0bdc5b8aa7/glance-httpd/0.log" Oct 04 05:48:47 crc kubenswrapper[4574]: I1004 05:48:47.626347 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57bfb4d496-nv6hv_85281a42-f9ab-4302-9fe9-4e742075530f/horizon/2.log" Oct 04 05:48:47 crc kubenswrapper[4574]: I1004 
05:48:47.821330 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57bfb4d496-nv6hv_85281a42-f9ab-4302-9fe9-4e742075530f/horizon/1.log" Oct 04 05:48:48 crc kubenswrapper[4574]: I1004 05:48:48.061887 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-j6n6s_5b869cbb-6227-4391-9faf-2565fc5a4acd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:48 crc kubenswrapper[4574]: I1004 05:48:48.115373 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kvnsk_39389db3-7317-49b0-af09-e9459d02c5e7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:48 crc kubenswrapper[4574]: I1004 05:48:48.275608 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57bfb4d496-nv6hv_85281a42-f9ab-4302-9fe9-4e742075530f/horizon-log/0.log" Oct 04 05:48:48 crc kubenswrapper[4574]: I1004 05:48:48.679721 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78786b8bfb-qgltl_1e4a50fe-8cee-4243-a215-9c82e358ea30/keystone-api/0.log" Oct 04 05:48:48 crc kubenswrapper[4574]: I1004 05:48:48.699964 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cd47459a-7171-4d7e-8f65-20a2936ce760/kube-state-metrics/0.log" Oct 04 05:48:48 crc kubenswrapper[4574]: I1004 05:48:48.846329 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qhprs_7f92a088-639a-4112-910b-bb2a76600bac/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:49 crc kubenswrapper[4574]: I1004 05:48:49.194805 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54746bc5fc-22pbj_e736cc6e-edb6-4fad-8687-6c4e2a85d0a0/neutron-api/0.log" Oct 04 05:48:50 crc kubenswrapper[4574]: I1004 05:48:50.067838 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-54746bc5fc-22pbj_e736cc6e-edb6-4fad-8687-6c4e2a85d0a0/neutron-httpd/0.log" Oct 04 05:48:51 crc kubenswrapper[4574]: I1004 05:48:51.075417 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8kbwt_658de4d9-d56d-45fd-b0bc-781bbbb30a5e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:51 crc kubenswrapper[4574]: I1004 05:48:51.625013 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_97365d9d-d7a3-42b9-8131-54dea698f6f8/nova-api-api/0.log" Oct 04 05:48:51 crc kubenswrapper[4574]: I1004 05:48:51.881095 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_97365d9d-d7a3-42b9-8131-54dea698f6f8/nova-api-log/0.log" Oct 04 05:48:52 crc kubenswrapper[4574]: I1004 05:48:52.573423 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8377c768-d10d-49d6-b43f-b1aeedcdeae6/nova-cell0-conductor-conductor/0.log" Oct 04 05:48:53 crc kubenswrapper[4574]: I1004 05:48:53.495552 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f6ae6da5-dd08-46b4-94cf-589b9c4f5139/nova-cell1-conductor-conductor/0.log" Oct 04 05:48:54 crc kubenswrapper[4574]: I1004 05:48:54.323970 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_43dfa220-f267-43c2-9b28-4dc23a4a3eeb/nova-cell1-novncproxy-novncproxy/0.log" Oct 04 05:48:54 crc kubenswrapper[4574]: I1004 05:48:54.611599 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-th9xm_d85707e9-6bd8-4f36-b3c0-d8a0ccc88811/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:54 crc kubenswrapper[4574]: I1004 05:48:54.987728 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_7e8c70bd-bcf3-4379-a026-5a52411a56ab/nova-metadata-log/0.log" Oct 04 05:48:55 crc kubenswrapper[4574]: I1004 05:48:55.497696 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_48cf5b3d-b1a3-4c9c-b2bb-82e54ca8519c/nova-scheduler-scheduler/0.log" Oct 04 05:48:55 crc kubenswrapper[4574]: I1004 05:48:55.803604 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f275e3ec-6c93-412b-875c-65b03a785dc0/mysql-bootstrap/0.log" Oct 04 05:48:55 crc kubenswrapper[4574]: I1004 05:48:55.993377 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f275e3ec-6c93-412b-875c-65b03a785dc0/mysql-bootstrap/0.log" Oct 04 05:48:56 crc kubenswrapper[4574]: I1004 05:48:56.119839 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f275e3ec-6c93-412b-875c-65b03a785dc0/galera/0.log" Oct 04 05:48:56 crc kubenswrapper[4574]: I1004 05:48:56.302554 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7e8c70bd-bcf3-4379-a026-5a52411a56ab/nova-metadata-metadata/0.log" Oct 04 05:48:56 crc kubenswrapper[4574]: I1004 05:48:56.490971 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c862e2a0-256a-470f-b35b-c244555f0c5f/mysql-bootstrap/0.log" Oct 04 05:48:56 crc kubenswrapper[4574]: I1004 05:48:56.743256 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:48:56 crc kubenswrapper[4574]: E1004 05:48:56.743543 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:48:56 crc kubenswrapper[4574]: I1004 05:48:56.884503 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c862e2a0-256a-470f-b35b-c244555f0c5f/mysql-bootstrap/0.log" Oct 04 05:48:56 crc kubenswrapper[4574]: I1004 05:48:56.932962 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c862e2a0-256a-470f-b35b-c244555f0c5f/galera/0.log" Oct 04 05:48:57 crc kubenswrapper[4574]: I1004 05:48:57.427871 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2552db74-0d8b-4ca0-af2e-092c03e097f2/openstackclient/0.log" Oct 04 05:48:57 crc kubenswrapper[4574]: I1004 05:48:57.529894 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-khsmk_3836030c-f0c4-4392-bc54-cc817fd89934/ovn-controller/0.log" Oct 04 05:48:57 crc kubenswrapper[4574]: I1004 05:48:57.860958 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4576m_fb659229-980c-4368-a799-f0db3f3330da/openstack-network-exporter/0.log" Oct 04 05:48:58 crc kubenswrapper[4574]: I1004 05:48:58.167925 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovsdb-server-init/0.log" Oct 04 05:48:58 crc kubenswrapper[4574]: I1004 05:48:58.472564 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovsdb-server-init/0.log" Oct 04 05:48:58 crc kubenswrapper[4574]: I1004 05:48:58.474596 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovsdb-server/0.log" Oct 04 05:48:58 crc kubenswrapper[4574]: I1004 05:48:58.477100 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-gl29s_12012c68-85d3-4063-90d2-b80d4d169f38/ovs-vswitchd/0.log" Oct 04 05:48:58 crc kubenswrapper[4574]: I1004 05:48:58.947853 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-pph2t_b1cddc5d-210f-4762-9f80-1b055ad2239b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:48:59 crc kubenswrapper[4574]: I1004 05:48:59.045640 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e306e34-ea03-4a60-9adc-99f30618be02/openstack-network-exporter/0.log" Oct 04 05:48:59 crc kubenswrapper[4574]: I1004 05:48:59.228267 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e306e34-ea03-4a60-9adc-99f30618be02/ovn-northd/0.log" Oct 04 05:48:59 crc kubenswrapper[4574]: I1004 05:48:59.436291 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cd1d5524-8818-4988-9969-45c2f2904fb4/openstack-network-exporter/0.log" Oct 04 05:48:59 crc kubenswrapper[4574]: I1004 05:48:59.588804 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_cd1d5524-8818-4988-9969-45c2f2904fb4/ovsdbserver-nb/0.log" Oct 04 05:48:59 crc kubenswrapper[4574]: I1004 05:48:59.705789 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e5b7c0f-9b1c-411c-94b0-f57b8157c998/openstack-network-exporter/0.log" Oct 04 05:48:59 crc kubenswrapper[4574]: I1004 05:48:59.954448 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e5b7c0f-9b1c-411c-94b0-f57b8157c998/ovsdbserver-sb/0.log" Oct 04 05:49:00 crc kubenswrapper[4574]: I1004 05:49:00.182468 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5c85977bcb-np6n7_462b910b-39e1-4a9e-a82c-3cfe77462a97/placement-api/0.log" Oct 04 05:49:00 crc kubenswrapper[4574]: I1004 05:49:00.402520 4574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-5c85977bcb-np6n7_462b910b-39e1-4a9e-a82c-3cfe77462a97/placement-log/0.log" Oct 04 05:49:00 crc kubenswrapper[4574]: I1004 05:49:00.583140 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bbad2653-45e8-4eb2-b7f8-60e6dcee36f2/setup-container/0.log" Oct 04 05:49:01 crc kubenswrapper[4574]: I1004 05:49:01.239824 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bbad2653-45e8-4eb2-b7f8-60e6dcee36f2/setup-container/0.log" Oct 04 05:49:01 crc kubenswrapper[4574]: I1004 05:49:01.307563 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bbad2653-45e8-4eb2-b7f8-60e6dcee36f2/rabbitmq/0.log" Oct 04 05:49:01 crc kubenswrapper[4574]: I1004 05:49:01.502904 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1298ffd0-9c09-4f29-b8bf-eaff9018fcb4/setup-container/0.log" Oct 04 05:49:01 crc kubenswrapper[4574]: I1004 05:49:01.707375 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1298ffd0-9c09-4f29-b8bf-eaff9018fcb4/setup-container/0.log" Oct 04 05:49:01 crc kubenswrapper[4574]: I1004 05:49:01.851446 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1298ffd0-9c09-4f29-b8bf-eaff9018fcb4/rabbitmq/0.log" Oct 04 05:49:02 crc kubenswrapper[4574]: I1004 05:49:02.055948 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rv2f6_eba11170-e0cf-4e7a-8e9a-771fde74bff1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:49:02 crc kubenswrapper[4574]: I1004 05:49:02.138522 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-74r7r_5ba9a62a-eb41-401f-ac26-779fb50b276a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:49:02 crc 
kubenswrapper[4574]: I1004 05:49:02.437392 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7gkdf_f0a5e204-886d-416f-96ad-46cc7715e417/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:49:02 crc kubenswrapper[4574]: I1004 05:49:02.728324 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jr6kt_0cad2098-82fe-4efb-89a6-a440ad6f73dc/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:49:02 crc kubenswrapper[4574]: I1004 05:49:02.914212 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5hpm9_b51be97d-af6b-432b-a671-040de2d05471/ssh-known-hosts-edpm-deployment/0.log" Oct 04 05:49:03 crc kubenswrapper[4574]: I1004 05:49:03.283053 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7688fc9d67-qlxww_710de145-ae9a-41bf-9b90-564a1e4acee6/proxy-server/0.log" Oct 04 05:49:03 crc kubenswrapper[4574]: I1004 05:49:03.306636 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7688fc9d67-qlxww_710de145-ae9a-41bf-9b90-564a1e4acee6/proxy-httpd/0.log" Oct 04 05:49:03 crc kubenswrapper[4574]: I1004 05:49:03.502399 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gbs8h_65ae5a48-3442-4149-9dbd-ac23191fa438/swift-ring-rebalance/0.log" Oct 04 05:49:03 crc kubenswrapper[4574]: I1004 05:49:03.705450 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-auditor/0.log" Oct 04 05:49:03 crc kubenswrapper[4574]: I1004 05:49:03.817480 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-reaper/0.log" Oct 04 05:49:04 crc kubenswrapper[4574]: I1004 05:49:04.053428 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-replicator/0.log" Oct 04 05:49:04 crc kubenswrapper[4574]: I1004 05:49:04.069001 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/account-server/0.log" Oct 04 05:49:04 crc kubenswrapper[4574]: I1004 05:49:04.106561 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-auditor/0.log" Oct 04 05:49:04 crc kubenswrapper[4574]: I1004 05:49:04.633071 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-server/0.log" Oct 04 05:49:04 crc kubenswrapper[4574]: I1004 05:49:04.697544 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-replicator/0.log" Oct 04 05:49:04 crc kubenswrapper[4574]: I1004 05:49:04.767409 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/container-updater/0.log" Oct 04 05:49:04 crc kubenswrapper[4574]: I1004 05:49:04.942738 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-auditor/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.069629 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-expirer/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.135344 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-replicator/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.269177 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-server/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.377195 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/rsync/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.406811 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/object-updater/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.603932 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_74b762df-991e-4e0c-9be6-c3e468408254/swift-recon-cron/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.870498 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5d8hw_8393cfca-67a9-4740-bb68-8a6cfe3f12b4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:49:05 crc kubenswrapper[4574]: I1004 05:49:05.992885 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_20e889e6-41a7-4c36-ac15-8dc429f15aeb/tempest-tests-tempest-tests-runner/0.log" Oct 04 05:49:06 crc kubenswrapper[4574]: I1004 05:49:06.413778 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f0d647c3-a19e-44ce-9e3e-be13cf6e9586/test-operator-logs-container/0.log" Oct 04 05:49:06 crc kubenswrapper[4574]: I1004 05:49:06.611645 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z98sd_b688d23c-d5f8-4fc1-bd58-8e710dae393b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 04 05:49:08 crc kubenswrapper[4574]: I1004 05:49:08.735099 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 
05:49:08 crc kubenswrapper[4574]: E1004 05:49:08.739397 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:49:11 crc kubenswrapper[4574]: I1004 05:49:11.716368 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_58b5f9d7-7329-4c3e-a7f6-fce81c9e7cb3/memcached/0.log" Oct 04 05:49:21 crc kubenswrapper[4574]: I1004 05:49:21.733520 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:49:21 crc kubenswrapper[4574]: E1004 05:49:21.734365 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:49:25 crc kubenswrapper[4574]: I1004 05:49:25.348996 4574 scope.go:117] "RemoveContainer" containerID="fe4ac8ac5ea2a29176521a18b49dbfad780ce490295f6a171f0758362e15ade3" Oct 04 05:49:30 crc kubenswrapper[4574]: I1004 05:49:30.439693 4574 generic.go:334] "Generic (PLEG): container finished" podID="379c7c79-8b4c-4b27-aba0-418f129cbaa4" containerID="0bd08e80c06896a729fb91f910c9174a8ddea1243dc924a34ec2605a3a873329" exitCode=0 Oct 04 05:49:30 crc kubenswrapper[4574]: I1004 05:49:30.439786 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-thlps" 
event={"ID":"379c7c79-8b4c-4b27-aba0-418f129cbaa4","Type":"ContainerDied","Data":"0bd08e80c06896a729fb91f910c9174a8ddea1243dc924a34ec2605a3a873329"} Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.556816 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.588647 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-thlps"] Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.599471 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-thlps"] Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.671373 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh6bm\" (UniqueName: \"kubernetes.io/projected/379c7c79-8b4c-4b27-aba0-418f129cbaa4-kube-api-access-dh6bm\") pod \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.671600 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/379c7c79-8b4c-4b27-aba0-418f129cbaa4-host\") pod \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\" (UID: \"379c7c79-8b4c-4b27-aba0-418f129cbaa4\") " Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.671651 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/379c7c79-8b4c-4b27-aba0-418f129cbaa4-host" (OuterVolumeSpecName: "host") pod "379c7c79-8b4c-4b27-aba0-418f129cbaa4" (UID: "379c7c79-8b4c-4b27-aba0-418f129cbaa4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.672351 4574 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/379c7c79-8b4c-4b27-aba0-418f129cbaa4-host\") on node \"crc\" DevicePath \"\"" Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.680690 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379c7c79-8b4c-4b27-aba0-418f129cbaa4-kube-api-access-dh6bm" (OuterVolumeSpecName: "kube-api-access-dh6bm") pod "379c7c79-8b4c-4b27-aba0-418f129cbaa4" (UID: "379c7c79-8b4c-4b27-aba0-418f129cbaa4"). InnerVolumeSpecName "kube-api-access-dh6bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:49:31 crc kubenswrapper[4574]: I1004 05:49:31.774800 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh6bm\" (UniqueName: \"kubernetes.io/projected/379c7c79-8b4c-4b27-aba0-418f129cbaa4-kube-api-access-dh6bm\") on node \"crc\" DevicePath \"\"" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.457576 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eac791112b79ecb120e4caa4d874ae05792606cff8c6b049b86bf728f42a73f8" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.457646 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-thlps" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.733953 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:49:32 crc kubenswrapper[4574]: E1004 05:49:32.734213 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.743570 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379c7c79-8b4c-4b27-aba0-418f129cbaa4" path="/var/lib/kubelet/pods/379c7c79-8b4c-4b27-aba0-418f129cbaa4/volumes" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.771599 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-5k4r8"] Oct 04 05:49:32 crc kubenswrapper[4574]: E1004 05:49:32.772157 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379c7c79-8b4c-4b27-aba0-418f129cbaa4" containerName="container-00" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.772223 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="379c7c79-8b4c-4b27-aba0-418f129cbaa4" containerName="container-00" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.772603 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="379c7c79-8b4c-4b27-aba0-418f129cbaa4" containerName="container-00" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.773326 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.775738 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xmxsr"/"default-dockercfg-mgss7" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.803585 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cff853b-18d6-491e-81df-cbdf0595eb32-host\") pod \"crc-debug-5k4r8\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.803724 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9sjk\" (UniqueName: \"kubernetes.io/projected/9cff853b-18d6-491e-81df-cbdf0595eb32-kube-api-access-r9sjk\") pod \"crc-debug-5k4r8\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.905475 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cff853b-18d6-491e-81df-cbdf0595eb32-host\") pod \"crc-debug-5k4r8\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.905567 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9sjk\" (UniqueName: \"kubernetes.io/projected/9cff853b-18d6-491e-81df-cbdf0595eb32-kube-api-access-r9sjk\") pod \"crc-debug-5k4r8\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.905620 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/9cff853b-18d6-491e-81df-cbdf0595eb32-host\") pod \"crc-debug-5k4r8\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:32 crc kubenswrapper[4574]: I1004 05:49:32.923072 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9sjk\" (UniqueName: \"kubernetes.io/projected/9cff853b-18d6-491e-81df-cbdf0595eb32-kube-api-access-r9sjk\") pod \"crc-debug-5k4r8\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:33 crc kubenswrapper[4574]: I1004 05:49:33.089636 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:33 crc kubenswrapper[4574]: I1004 05:49:33.467852 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" event={"ID":"9cff853b-18d6-491e-81df-cbdf0595eb32","Type":"ContainerStarted","Data":"cafc4e6fe53bb50d175e6c212e1d72ee7adf081cb2ef56c436e8e163020e6a23"} Oct 04 05:49:33 crc kubenswrapper[4574]: I1004 05:49:33.468100 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" event={"ID":"9cff853b-18d6-491e-81df-cbdf0595eb32","Type":"ContainerStarted","Data":"0ea5a9a9df4045e3d78b808fa4545f632f5edf29a9dcabbec0c0328b73551a0b"} Oct 04 05:49:33 crc kubenswrapper[4574]: I1004 05:49:33.489627 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" podStartSLOduration=1.489607194 podStartE2EDuration="1.489607194s" podCreationTimestamp="2025-10-04 05:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-04 05:49:33.478891335 +0000 UTC m=+3799.333034377" watchObservedRunningTime="2025-10-04 05:49:33.489607194 +0000 UTC m=+3799.343750236" Oct 04 
05:49:34 crc kubenswrapper[4574]: I1004 05:49:34.485313 4574 generic.go:334] "Generic (PLEG): container finished" podID="9cff853b-18d6-491e-81df-cbdf0595eb32" containerID="cafc4e6fe53bb50d175e6c212e1d72ee7adf081cb2ef56c436e8e163020e6a23" exitCode=0 Oct 04 05:49:34 crc kubenswrapper[4574]: I1004 05:49:34.485394 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" event={"ID":"9cff853b-18d6-491e-81df-cbdf0595eb32","Type":"ContainerDied","Data":"cafc4e6fe53bb50d175e6c212e1d72ee7adf081cb2ef56c436e8e163020e6a23"} Oct 04 05:49:35 crc kubenswrapper[4574]: I1004 05:49:35.630593 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:35 crc kubenswrapper[4574]: I1004 05:49:35.660793 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cff853b-18d6-491e-81df-cbdf0595eb32-host\") pod \"9cff853b-18d6-491e-81df-cbdf0595eb32\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " Oct 04 05:49:35 crc kubenswrapper[4574]: I1004 05:49:35.660930 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cff853b-18d6-491e-81df-cbdf0595eb32-host" (OuterVolumeSpecName: "host") pod "9cff853b-18d6-491e-81df-cbdf0595eb32" (UID: "9cff853b-18d6-491e-81df-cbdf0595eb32"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:49:35 crc kubenswrapper[4574]: I1004 05:49:35.660957 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9sjk\" (UniqueName: \"kubernetes.io/projected/9cff853b-18d6-491e-81df-cbdf0595eb32-kube-api-access-r9sjk\") pod \"9cff853b-18d6-491e-81df-cbdf0595eb32\" (UID: \"9cff853b-18d6-491e-81df-cbdf0595eb32\") " Oct 04 05:49:35 crc kubenswrapper[4574]: I1004 05:49:35.661652 4574 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cff853b-18d6-491e-81df-cbdf0595eb32-host\") on node \"crc\" DevicePath \"\"" Oct 04 05:49:35 crc kubenswrapper[4574]: I1004 05:49:35.675354 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cff853b-18d6-491e-81df-cbdf0595eb32-kube-api-access-r9sjk" (OuterVolumeSpecName: "kube-api-access-r9sjk") pod "9cff853b-18d6-491e-81df-cbdf0595eb32" (UID: "9cff853b-18d6-491e-81df-cbdf0595eb32"). InnerVolumeSpecName "kube-api-access-r9sjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:49:35 crc kubenswrapper[4574]: I1004 05:49:35.764151 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9sjk\" (UniqueName: \"kubernetes.io/projected/9cff853b-18d6-491e-81df-cbdf0595eb32-kube-api-access-r9sjk\") on node \"crc\" DevicePath \"\"" Oct 04 05:49:36 crc kubenswrapper[4574]: I1004 05:49:36.514294 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" event={"ID":"9cff853b-18d6-491e-81df-cbdf0595eb32","Type":"ContainerDied","Data":"0ea5a9a9df4045e3d78b808fa4545f632f5edf29a9dcabbec0c0328b73551a0b"} Oct 04 05:49:36 crc kubenswrapper[4574]: I1004 05:49:36.514333 4574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ea5a9a9df4045e3d78b808fa4545f632f5edf29a9dcabbec0c0328b73551a0b" Oct 04 05:49:36 crc kubenswrapper[4574]: I1004 05:49:36.514338 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-5k4r8" Oct 04 05:49:40 crc kubenswrapper[4574]: I1004 05:49:40.101161 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-5k4r8"] Oct 04 05:49:40 crc kubenswrapper[4574]: I1004 05:49:40.110785 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-5k4r8"] Oct 04 05:49:40 crc kubenswrapper[4574]: I1004 05:49:40.744653 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cff853b-18d6-491e-81df-cbdf0595eb32" path="/var/lib/kubelet/pods/9cff853b-18d6-491e-81df-cbdf0595eb32/volumes" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.260141 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-fzjsf"] Oct 04 05:49:41 crc kubenswrapper[4574]: E1004 05:49:41.261012 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cff853b-18d6-491e-81df-cbdf0595eb32" 
containerName="container-00" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.261033 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cff853b-18d6-491e-81df-cbdf0595eb32" containerName="container-00" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.261256 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cff853b-18d6-491e-81df-cbdf0595eb32" containerName="container-00" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.262040 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.264009 4574 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xmxsr"/"default-dockercfg-mgss7" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.362595 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0779352e-899f-4a6b-a1c3-ceab1c0612df-host\") pod \"crc-debug-fzjsf\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.362671 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4cn\" (UniqueName: \"kubernetes.io/projected/0779352e-899f-4a6b-a1c3-ceab1c0612df-kube-api-access-zf4cn\") pod \"crc-debug-fzjsf\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.464334 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0779352e-899f-4a6b-a1c3-ceab1c0612df-host\") pod \"crc-debug-fzjsf\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 
05:49:41.464388 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4cn\" (UniqueName: \"kubernetes.io/projected/0779352e-899f-4a6b-a1c3-ceab1c0612df-kube-api-access-zf4cn\") pod \"crc-debug-fzjsf\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.464479 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0779352e-899f-4a6b-a1c3-ceab1c0612df-host\") pod \"crc-debug-fzjsf\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.481430 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4cn\" (UniqueName: \"kubernetes.io/projected/0779352e-899f-4a6b-a1c3-ceab1c0612df-kube-api-access-zf4cn\") pod \"crc-debug-fzjsf\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:41 crc kubenswrapper[4574]: I1004 05:49:41.579685 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:42 crc kubenswrapper[4574]: I1004 05:49:42.567105 4574 generic.go:334] "Generic (PLEG): container finished" podID="0779352e-899f-4a6b-a1c3-ceab1c0612df" containerID="04a54711792a9aeb8388db96cc38c770497596d554952ebe808af3cafaf145ad" exitCode=0 Oct 04 05:49:42 crc kubenswrapper[4574]: I1004 05:49:42.567155 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" event={"ID":"0779352e-899f-4a6b-a1c3-ceab1c0612df","Type":"ContainerDied","Data":"04a54711792a9aeb8388db96cc38c770497596d554952ebe808af3cafaf145ad"} Oct 04 05:49:42 crc kubenswrapper[4574]: I1004 05:49:42.567477 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" event={"ID":"0779352e-899f-4a6b-a1c3-ceab1c0612df","Type":"ContainerStarted","Data":"5531ecf048a314ede3dd25955a4e53c3ddb26a09bd808f2c1641ab0374277677"} Oct 04 05:49:42 crc kubenswrapper[4574]: I1004 05:49:42.606564 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-fzjsf"] Oct 04 05:49:42 crc kubenswrapper[4574]: I1004 05:49:42.615585 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmxsr/crc-debug-fzjsf"] Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.673820 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.733507 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:49:43 crc kubenswrapper[4574]: E1004 05:49:43.733903 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.803624 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0779352e-899f-4a6b-a1c3-ceab1c0612df-host\") pod \"0779352e-899f-4a6b-a1c3-ceab1c0612df\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.804023 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf4cn\" (UniqueName: \"kubernetes.io/projected/0779352e-899f-4a6b-a1c3-ceab1c0612df-kube-api-access-zf4cn\") pod \"0779352e-899f-4a6b-a1c3-ceab1c0612df\" (UID: \"0779352e-899f-4a6b-a1c3-ceab1c0612df\") " Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.805445 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0779352e-899f-4a6b-a1c3-ceab1c0612df-host" (OuterVolumeSpecName: "host") pod "0779352e-899f-4a6b-a1c3-ceab1c0612df" (UID: "0779352e-899f-4a6b-a1c3-ceab1c0612df"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.823213 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0779352e-899f-4a6b-a1c3-ceab1c0612df-kube-api-access-zf4cn" (OuterVolumeSpecName: "kube-api-access-zf4cn") pod "0779352e-899f-4a6b-a1c3-ceab1c0612df" (UID: "0779352e-899f-4a6b-a1c3-ceab1c0612df"). InnerVolumeSpecName "kube-api-access-zf4cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.906590 4574 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0779352e-899f-4a6b-a1c3-ceab1c0612df-host\") on node \"crc\" DevicePath \"\"" Oct 04 05:49:43 crc kubenswrapper[4574]: I1004 05:49:43.906623 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf4cn\" (UniqueName: \"kubernetes.io/projected/0779352e-899f-4a6b-a1c3-ceab1c0612df-kube-api-access-zf4cn\") on node \"crc\" DevicePath \"\"" Oct 04 05:49:44 crc kubenswrapper[4574]: I1004 05:49:44.351819 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-mgwq7_9c976366-a9b2-4720-a5ce-2aeffaf0dad2/kube-rbac-proxy/0.log" Oct 04 05:49:44 crc kubenswrapper[4574]: I1004 05:49:44.444105 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-mgwq7_9c976366-a9b2-4720-a5ce-2aeffaf0dad2/manager/0.log" Oct 04 05:49:44 crc kubenswrapper[4574]: I1004 05:49:44.592927 4574 scope.go:117] "RemoveContainer" containerID="04a54711792a9aeb8388db96cc38c770497596d554952ebe808af3cafaf145ad" Oct 04 05:49:44 crc kubenswrapper[4574]: I1004 05:49:44.592981 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/crc-debug-fzjsf" Oct 04 05:49:44 crc kubenswrapper[4574]: I1004 05:49:44.643119 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-9t5xx_4552356b-ed71-465f-beb5-26c4a63dc81d/kube-rbac-proxy/0.log" Oct 04 05:49:44 crc kubenswrapper[4574]: I1004 05:49:44.734304 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-9t5xx_4552356b-ed71-465f-beb5-26c4a63dc81d/manager/0.log" Oct 04 05:49:44 crc kubenswrapper[4574]: I1004 05:49:44.767661 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0779352e-899f-4a6b-a1c3-ceab1c0612df" path="/var/lib/kubelet/pods/0779352e-899f-4a6b-a1c3-ceab1c0612df/volumes" Oct 04 05:49:44 crc kubenswrapper[4574]: E1004 05:49:44.882653 4574 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0779352e_899f_4a6b_a1c3_ceab1c0612df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0779352e_899f_4a6b_a1c3_ceab1c0612df.slice/crio-5531ecf048a314ede3dd25955a4e53c3ddb26a09bd808f2c1641ab0374277677\": RecentStats: unable to find data in memory cache]" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.069441 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/util/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.328195 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/util/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.347541 4574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/pull/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.364503 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/pull/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.515020 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/util/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.551968 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/pull/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.553422 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dec09d91cb4d657aa20634456587036bdc1122a6cab5d48dada5906b7a96v94_f42d7d5a-0727-4798-96da-ae6e57b9f3c5/extract/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.725964 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-qbzx8_39766d86-7ab2-42ca-b6ae-0e02eb871cc3/kube-rbac-proxy/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.785472 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-qbzx8_39766d86-7ab2-42ca-b6ae-0e02eb871cc3/manager/0.log" Oct 04 05:49:45 crc kubenswrapper[4574]: I1004 05:49:45.852830 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-pmvc8_52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd/kube-rbac-proxy/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.002298 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-pmvc8_52c00fa4-a69f-4f76-9b82-ee7fdcc3a0fd/manager/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.077355 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-mdh2j_d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7/manager/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.101977 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-mdh2j_d5f472c8-8d6c-46f0-bed2-ff2b19f3fcf7/kube-rbac-proxy/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.251303 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-96hsk_d552b4e4-9120-4d96-8615-fa6d68a71042/kube-rbac-proxy/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.318231 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-96hsk_d552b4e4-9120-4d96-8615-fa6d68a71042/manager/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.446487 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-gnpjd_e288039e-c6d3-4911-b284-1eb1cd2bccf2/kube-rbac-proxy/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.578892 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-gnpjd_e288039e-c6d3-4911-b284-1eb1cd2bccf2/manager/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.643620 4574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-xcjwv_d4f548d4-c2a0-4756-a55a-3d398b81d923/kube-rbac-proxy/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.799491 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-xcjwv_d4f548d4-c2a0-4756-a55a-3d398b81d923/manager/0.log" Oct 04 05:49:46 crc kubenswrapper[4574]: I1004 05:49:46.859770 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c777dc986-cvjnd_55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4/kube-rbac-proxy/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.003444 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c777dc986-cvjnd_55c14b8b-0e39-40a8-8f1c-9eefffe0f3a4/manager/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.114870 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-88mfj_1edbf723-752f-416b-a922-12a73521d6f9/kube-rbac-proxy/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.161751 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-88mfj_1edbf723-752f-416b-a922-12a73521d6f9/manager/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.298307 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j_85b1921d-1572-4aff-b002-2f31c2f270b4/kube-rbac-proxy/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.359123 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-7rs7j_85b1921d-1572-4aff-b002-2f31c2f270b4/manager/0.log" Oct 04 05:49:47 crc 
kubenswrapper[4574]: I1004 05:49:47.450014 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jt72t_90b04996-9e73-45c9-a03c-59e4bedf4ff4/kube-rbac-proxy/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.550400 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-jt72t_90b04996-9e73-45c9-a03c-59e4bedf4ff4/manager/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.604282 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-t222j_95f9af94-f839-464f-8c6f-8928146b0d26/kube-rbac-proxy/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.730862 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-t222j_95f9af94-f839-464f-8c6f-8928146b0d26/manager/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.830600 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-g2kpz_54443166-57a5-4e11-914c-d9cb2f3252d7/kube-rbac-proxy/0.log" Oct 04 05:49:47 crc kubenswrapper[4574]: I1004 05:49:47.857423 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-g2kpz_54443166-57a5-4e11-914c-d9cb2f3252d7/manager/0.log" Oct 04 05:49:48 crc kubenswrapper[4574]: I1004 05:49:48.093069 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cz7492_f0b7b141-c133-4487-9ecb-fab0b12d82bb/manager/0.log" Oct 04 05:49:48 crc kubenswrapper[4574]: I1004 05:49:48.136119 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cz7492_f0b7b141-c133-4487-9ecb-fab0b12d82bb/kube-rbac-proxy/0.log" Oct 04 05:49:48 crc kubenswrapper[4574]: I1004 05:49:48.220934 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8fff4c848-5cvwf_9169e6bf-53d3-420e-bb99-b9d897653612/kube-rbac-proxy/0.log" Oct 04 05:49:48 crc kubenswrapper[4574]: I1004 05:49:48.563304 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76d7b4df79-hsvhp_6c734153-0dff-4669-ae00-bd91be75e4c6/kube-rbac-proxy/0.log" Oct 04 05:49:48 crc kubenswrapper[4574]: I1004 05:49:48.747723 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76d7b4df79-hsvhp_6c734153-0dff-4669-ae00-bd91be75e4c6/operator/0.log" Oct 04 05:49:48 crc kubenswrapper[4574]: I1004 05:49:48.834452 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nk4xx_116021ce-1084-4c34-b4b8-9499015e58c0/registry-server/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.135287 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-sxfrz_46bd489f-f708-4c7e-b697-39e9fd65a30e/kube-rbac-proxy/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.220722 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-sxfrz_46bd489f-f708-4c7e-b697-39e9fd65a30e/manager/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.422764 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-b4fbd_28570522-1dff-475f-8ab0-963f4ac14534/kube-rbac-proxy/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 
05:49:49.536327 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-b4fbd_28570522-1dff-475f-8ab0-963f4ac14534/manager/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.561562 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8fff4c848-5cvwf_9169e6bf-53d3-420e-bb99-b9d897653612/manager/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.626259 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-g8hz2_a95cec28-a993-4f56-b540-18ad84c5bd2d/operator/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.751180 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-2fzvp_e227d829-9a02-40dd-b0c5-012a7d024253/kube-rbac-proxy/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.831266 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-2fzvp_e227d829-9a02-40dd-b0c5-012a7d024253/manager/0.log" Oct 04 05:49:49 crc kubenswrapper[4574]: I1004 05:49:49.868520 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-hfm8z_60dfec70-f10c-4d73-9933-f2cb76124090/kube-rbac-proxy/0.log" Oct 04 05:49:50 crc kubenswrapper[4574]: I1004 05:49:50.006026 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-hfm8z_60dfec70-f10c-4d73-9933-f2cb76124090/manager/0.log" Oct 04 05:49:50 crc kubenswrapper[4574]: I1004 05:49:50.063363 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mhxlg_f87750ff-5d28-4658-b7d4-bc49bcb35886/manager/0.log" Oct 04 05:49:50 
crc kubenswrapper[4574]: I1004 05:49:50.086753 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mhxlg_f87750ff-5d28-4658-b7d4-bc49bcb35886/kube-rbac-proxy/0.log" Oct 04 05:49:50 crc kubenswrapper[4574]: I1004 05:49:50.272604 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-llj5f_cb68cf9f-4ba2-410a-85f7-1db627311ff6/kube-rbac-proxy/0.log" Oct 04 05:49:50 crc kubenswrapper[4574]: I1004 05:49:50.299468 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-llj5f_cb68cf9f-4ba2-410a-85f7-1db627311ff6/manager/0.log" Oct 04 05:49:57 crc kubenswrapper[4574]: I1004 05:49:57.733266 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:49:57 crc kubenswrapper[4574]: E1004 05:49:57.734083 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:50:06 crc kubenswrapper[4574]: I1004 05:50:06.329917 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x7jjx_d9424aaa-698a-43e0-ae1c-614cc4c538a6/control-plane-machine-set-operator/0.log" Oct 04 05:50:06 crc kubenswrapper[4574]: I1004 05:50:06.459698 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hkp92_da00c73e-dcd3-4fb7-aedd-77c84ea82855/kube-rbac-proxy/0.log" Oct 04 05:50:06 crc kubenswrapper[4574]: I1004 
05:50:06.511865 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hkp92_da00c73e-dcd3-4fb7-aedd-77c84ea82855/machine-api-operator/0.log" Oct 04 05:50:12 crc kubenswrapper[4574]: I1004 05:50:12.734495 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:50:12 crc kubenswrapper[4574]: E1004 05:50:12.735306 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:50:18 crc kubenswrapper[4574]: I1004 05:50:18.731988 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qgjh7_58237b74-7f6c-4cd8-b9ba-df68ba8f8c0e/cert-manager-controller/0.log" Oct 04 05:50:18 crc kubenswrapper[4574]: I1004 05:50:18.879052 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-f7cbs_25a7bfba-1bab-42d6-bb47-827aeeeefdbc/cert-manager-cainjector/0.log" Oct 04 05:50:19 crc kubenswrapper[4574]: I1004 05:50:19.015575 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mfxk5_cd556473-f56f-419c-b1b9-3a59dca5f00f/cert-manager-webhook/0.log" Oct 04 05:50:23 crc kubenswrapper[4574]: I1004 05:50:23.734041 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:50:23 crc kubenswrapper[4574]: E1004 05:50:23.734944 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:50:31 crc kubenswrapper[4574]: I1004 05:50:31.105701 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-vlq89_dc0445fe-9646-4248-a71b-c0dfff8b50f2/nmstate-console-plugin/0.log" Oct 04 05:50:31 crc kubenswrapper[4574]: I1004 05:50:31.282290 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-d95j7_88957498-0f2f-4fb7-baca-fc52a6abec78/nmstate-handler/0.log" Oct 04 05:50:31 crc kubenswrapper[4574]: I1004 05:50:31.390101 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6mt9v_7131c3ab-9443-4308-acef-460450511901/kube-rbac-proxy/0.log" Oct 04 05:50:31 crc kubenswrapper[4574]: I1004 05:50:31.403831 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6mt9v_7131c3ab-9443-4308-acef-460450511901/nmstate-metrics/0.log" Oct 04 05:50:31 crc kubenswrapper[4574]: I1004 05:50:31.562827 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-cxhlp_34ee31e2-d15b-4055-9e27-2ce2e9e43c28/nmstate-operator/0.log" Oct 04 05:50:31 crc kubenswrapper[4574]: I1004 05:50:31.629142 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-p9s5q_77cd30e5-9b3e-4e6a-83c1-b86c2f0d4bc6/nmstate-webhook/0.log" Oct 04 05:50:35 crc kubenswrapper[4574]: I1004 05:50:35.733213 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:50:35 crc kubenswrapper[4574]: E1004 05:50:35.735195 4574 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:50:45 crc kubenswrapper[4574]: I1004 05:50:45.683538 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-fl9dm_7de5a0bd-8082-40f2-9288-2c5417547a96/kube-rbac-proxy/0.log" Oct 04 05:50:45 crc kubenswrapper[4574]: I1004 05:50:45.716845 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-fl9dm_7de5a0bd-8082-40f2-9288-2c5417547a96/controller/0.log" Oct 04 05:50:45 crc kubenswrapper[4574]: I1004 05:50:45.892765 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.110538 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.110863 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.157738 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.193105 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.382099 4574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.382285 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.410074 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.438675 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.654346 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-reloader/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.671250 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-metrics/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.671248 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/cp-frr-files/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.714133 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/controller/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.900777 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/kube-rbac-proxy/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.918674 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/frr-metrics/0.log" Oct 04 05:50:46 crc kubenswrapper[4574]: I1004 05:50:46.958763 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/kube-rbac-proxy-frr/0.log" Oct 04 05:50:47 crc kubenswrapper[4574]: I1004 05:50:47.111918 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/reloader/0.log" Oct 04 05:50:47 crc kubenswrapper[4574]: I1004 05:50:47.255863 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-2lxv7_d6ba0ff3-f7a4-4a53-9730-bd6d57a43a13/frr-k8s-webhook-server/0.log" Oct 04 05:50:47 crc kubenswrapper[4574]: I1004 05:50:47.505036 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7956f7d5bc-68jqm_cb7b54dc-1c7a-4728-aa2a-8e145fc94fb3/manager/0.log" Oct 04 05:50:47 crc kubenswrapper[4574]: I1004 05:50:47.645964 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78dd4884c9-9rbjh_0b23a9bd-b984-4ec1-b18a-9617dad3a194/webhook-server/0.log" Oct 04 05:50:47 crc kubenswrapper[4574]: I1004 05:50:47.839396 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pjq5j_d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb/kube-rbac-proxy/0.log" Oct 04 05:50:48 crc kubenswrapper[4574]: I1004 05:50:48.168144 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qfz6d_54b0a1bb-eb0c-4ff2-b41d-966594fe7504/frr/0.log" Oct 04 05:50:48 crc kubenswrapper[4574]: I1004 05:50:48.329638 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pjq5j_d7cbaf19-978e-44d4-8bfb-0e6f3224d5bb/speaker/0.log" Oct 04 05:50:50 crc kubenswrapper[4574]: I1004 05:50:50.734632 4574 scope.go:117] 
"RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:50:50 crc kubenswrapper[4574]: E1004 05:50:50.734827 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:51:00 crc kubenswrapper[4574]: I1004 05:51:00.327862 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/util/0.log" Oct 04 05:51:00 crc kubenswrapper[4574]: I1004 05:51:00.507665 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/pull/0.log" Oct 04 05:51:00 crc kubenswrapper[4574]: I1004 05:51:00.530011 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/util/0.log" Oct 04 05:51:00 crc kubenswrapper[4574]: I1004 05:51:00.569440 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/pull/0.log" Oct 04 05:51:00 crc kubenswrapper[4574]: I1004 05:51:00.808134 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/extract/0.log" Oct 04 05:51:00 crc kubenswrapper[4574]: I1004 05:51:00.832678 4574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/pull/0.log" Oct 04 05:51:00 crc kubenswrapper[4574]: I1004 05:51:00.846802 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d28v5h6_8493ffda-5976-4e28-9927-9bc66b26fccf/util/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.050335 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-utilities/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.192457 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-content/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.236932 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-content/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.240995 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-utilities/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.381185 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-utilities/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.407676 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/extract-content/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.699219 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-utilities/0.log" Oct 04 05:51:01 crc kubenswrapper[4574]: I1004 05:51:01.962262 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tfrp6_ba61d575-a013-4481-b936-66c5f531f238/registry-server/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.031767 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-content/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.090279 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-utilities/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.093640 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-content/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.237559 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-utilities/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.250570 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/extract-content/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.511422 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/util/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.610412 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-njffb_cbc61b4c-90a1-434b-b6ae-a845c4fa0bfd/registry-server/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.692766 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/util/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.749974 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/pull/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.790452 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/pull/0.log" Oct 04 05:51:02 crc kubenswrapper[4574]: I1004 05:51:02.981904 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/extract/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.000994 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/util/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.035961 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cwlrkw_108ad5dd-cca2-4fcd-9f61-e3337ad0da82/pull/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.252418 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-28wcm_40f47671-d6bd-402e-8003-3688245aa0ed/marketplace-operator/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: 
I1004 05:51:03.344165 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-utilities/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.514453 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-content/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.537688 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-utilities/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.572605 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-content/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.779670 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-content/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.860370 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/registry-server/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.877541 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9mdwq_0a8d9eda-f6e8-4f07-9a5c-4162010bfb9a/extract-utilities/0.log" Oct 04 05:51:03 crc kubenswrapper[4574]: I1004 05:51:03.991742 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-utilities/0.log" Oct 04 05:51:04 crc kubenswrapper[4574]: I1004 05:51:04.199899 4574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-utilities/0.log" Oct 04 05:51:04 crc kubenswrapper[4574]: I1004 05:51:04.255968 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-content/0.log" Oct 04 05:51:04 crc kubenswrapper[4574]: I1004 05:51:04.326165 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-content/0.log" Oct 04 05:51:04 crc kubenswrapper[4574]: I1004 05:51:04.480870 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-utilities/0.log" Oct 04 05:51:04 crc kubenswrapper[4574]: I1004 05:51:04.501603 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/extract-content/0.log" Oct 04 05:51:04 crc kubenswrapper[4574]: I1004 05:51:04.978782 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qqtj9_e881d007-aeba-48d9-8470-62ff6311df35/registry-server/0.log" Oct 04 05:51:05 crc kubenswrapper[4574]: I1004 05:51:05.733185 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:51:05 crc kubenswrapper[4574]: E1004 05:51:05.733790 4574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wl5xt_openshift-machine-config-operator(75910bdc-1940-4d15-b390-4bcfcec9f72c)\"" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" Oct 04 05:51:19 crc 
kubenswrapper[4574]: I1004 05:51:19.740890 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41" Oct 04 05:51:20 crc kubenswrapper[4574]: I1004 05:51:20.467025 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"69e8fcd7aa35e9b0ed87acc47978ec5c3c20ef3794b3a68222e8dbdb4bd47f56"} Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.364321 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d9f4p"] Oct 04 05:51:41 crc kubenswrapper[4574]: E1004 05:51:41.365330 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0779352e-899f-4a6b-a1c3-ceab1c0612df" containerName="container-00" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.365348 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0779352e-899f-4a6b-a1c3-ceab1c0612df" containerName="container-00" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.365593 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="0779352e-899f-4a6b-a1c3-ceab1c0612df" containerName="container-00" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.367828 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.448862 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d9f4p"] Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.490292 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nznqh\" (UniqueName: \"kubernetes.io/projected/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-kube-api-access-nznqh\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.490513 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-utilities\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.490572 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-catalog-content\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.592603 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nznqh\" (UniqueName: \"kubernetes.io/projected/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-kube-api-access-nznqh\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.593135 4574 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-utilities\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.593191 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-catalog-content\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.593654 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-utilities\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.593716 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-catalog-content\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.616070 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nznqh\" (UniqueName: \"kubernetes.io/projected/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-kube-api-access-nznqh\") pod \"certified-operators-d9f4p\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:41 crc kubenswrapper[4574]: I1004 05:51:41.687960 4574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:42 crc kubenswrapper[4574]: I1004 05:51:42.328916 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d9f4p"] Oct 04 05:51:42 crc kubenswrapper[4574]: I1004 05:51:42.668558 4574 generic.go:334] "Generic (PLEG): container finished" podID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerID="c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b" exitCode=0 Oct 04 05:51:42 crc kubenswrapper[4574]: I1004 05:51:42.668730 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9f4p" event={"ID":"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5","Type":"ContainerDied","Data":"c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b"} Oct 04 05:51:42 crc kubenswrapper[4574]: I1004 05:51:42.669877 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9f4p" event={"ID":"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5","Type":"ContainerStarted","Data":"d3e3c04547c2d3849bb129ce9384094237e7e6feb2dc3769f4f104c7f3a054b7"} Oct 04 05:51:42 crc kubenswrapper[4574]: I1004 05:51:42.671397 4574 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 04 05:51:43 crc kubenswrapper[4574]: I1004 05:51:43.678574 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9f4p" event={"ID":"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5","Type":"ContainerStarted","Data":"31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec"} Oct 04 05:51:44 crc kubenswrapper[4574]: I1004 05:51:44.689388 4574 generic.go:334] "Generic (PLEG): container finished" podID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerID="31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec" exitCode=0 Oct 04 05:51:44 crc kubenswrapper[4574]: I1004 05:51:44.689561 4574 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-d9f4p" event={"ID":"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5","Type":"ContainerDied","Data":"31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec"} Oct 04 05:51:45 crc kubenswrapper[4574]: I1004 05:51:45.722758 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9f4p" event={"ID":"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5","Type":"ContainerStarted","Data":"07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7"} Oct 04 05:51:45 crc kubenswrapper[4574]: I1004 05:51:45.742523 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d9f4p" podStartSLOduration=2.270298415 podStartE2EDuration="4.742506333s" podCreationTimestamp="2025-10-04 05:51:41 +0000 UTC" firstStartedPulling="2025-10-04 05:51:42.671138856 +0000 UTC m=+3928.525281898" lastFinishedPulling="2025-10-04 05:51:45.143346754 +0000 UTC m=+3930.997489816" observedRunningTime="2025-10-04 05:51:45.741422912 +0000 UTC m=+3931.595565964" watchObservedRunningTime="2025-10-04 05:51:45.742506333 +0000 UTC m=+3931.596649375" Oct 04 05:51:51 crc kubenswrapper[4574]: I1004 05:51:51.689686 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:51 crc kubenswrapper[4574]: I1004 05:51:51.691687 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:51 crc kubenswrapper[4574]: I1004 05:51:51.746137 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:51 crc kubenswrapper[4574]: I1004 05:51:51.824212 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:51 crc kubenswrapper[4574]: I1004 
05:51:51.991394 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d9f4p"] Oct 04 05:51:53 crc kubenswrapper[4574]: I1004 05:51:53.790353 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d9f4p" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="registry-server" containerID="cri-o://07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7" gracePeriod=2 Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.304718 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.491760 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-catalog-content\") pod \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.491891 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-utilities\") pod \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.491956 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nznqh\" (UniqueName: \"kubernetes.io/projected/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-kube-api-access-nznqh\") pod \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\" (UID: \"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5\") " Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.493983 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-utilities" (OuterVolumeSpecName: 
"utilities") pod "558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" (UID: "558bcf6c-6a23-4f2f-b2e1-228f680d4ae5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.511733 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-kube-api-access-nznqh" (OuterVolumeSpecName: "kube-api-access-nznqh") pod "558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" (UID: "558bcf6c-6a23-4f2f-b2e1-228f680d4ae5"). InnerVolumeSpecName "kube-api-access-nznqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.548647 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" (UID: "558bcf6c-6a23-4f2f-b2e1-228f680d4ae5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.593811 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.593845 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nznqh\" (UniqueName: \"kubernetes.io/projected/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-kube-api-access-nznqh\") on node \"crc\" DevicePath \"\"" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.593860 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.802247 4574 generic.go:334] "Generic (PLEG): container finished" podID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerID="07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7" exitCode=0 Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.802498 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9f4p" event={"ID":"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5","Type":"ContainerDied","Data":"07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7"} Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.803526 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9f4p" event={"ID":"558bcf6c-6a23-4f2f-b2e1-228f680d4ae5","Type":"ContainerDied","Data":"d3e3c04547c2d3849bb129ce9384094237e7e6feb2dc3769f4f104c7f3a054b7"} Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.803647 4574 scope.go:117] "RemoveContainer" containerID="07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 
05:51:54.802601 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d9f4p" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.832115 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d9f4p"] Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.840409 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d9f4p"] Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.844486 4574 scope.go:117] "RemoveContainer" containerID="31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.867401 4574 scope.go:117] "RemoveContainer" containerID="c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.906000 4574 scope.go:117] "RemoveContainer" containerID="07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7" Oct 04 05:51:54 crc kubenswrapper[4574]: E1004 05:51:54.906420 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7\": container with ID starting with 07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7 not found: ID does not exist" containerID="07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.906449 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7"} err="failed to get container status \"07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7\": rpc error: code = NotFound desc = could not find container \"07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7\": container with ID starting with 
07bfa352c4f1bbcd13f8eb5d4fe6cfeae3b442fa19b7a2479603887565fd04f7 not found: ID does not exist" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.906493 4574 scope.go:117] "RemoveContainer" containerID="31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec" Oct 04 05:51:54 crc kubenswrapper[4574]: E1004 05:51:54.907018 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec\": container with ID starting with 31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec not found: ID does not exist" containerID="31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.907051 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec"} err="failed to get container status \"31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec\": rpc error: code = NotFound desc = could not find container \"31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec\": container with ID starting with 31cbcdf38897e211b356695e805d106382ebb0bd83bcbfb05825edd624b2a4ec not found: ID does not exist" Oct 04 05:51:54 crc kubenswrapper[4574]: I1004 05:51:54.907075 4574 scope.go:117] "RemoveContainer" containerID="c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b" Oct 04 05:51:54 crc kubenswrapper[4574]: E1004 05:51:54.907423 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b\": container with ID starting with c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b not found: ID does not exist" containerID="c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b" Oct 04 05:51:54 crc 
kubenswrapper[4574]: I1004 05:51:54.907449 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b"} err="failed to get container status \"c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b\": rpc error: code = NotFound desc = could not find container \"c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b\": container with ID starting with c4892985cd0ea0a5629998deaa75fcdaa214eff5607459c67502afc42b8a9e1b not found: ID does not exist" Oct 04 05:51:56 crc kubenswrapper[4574]: I1004 05:51:56.758809 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" path="/var/lib/kubelet/pods/558bcf6c-6a23-4f2f-b2e1-228f680d4ae5/volumes" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.122011 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q2nqr"] Oct 04 05:52:33 crc kubenswrapper[4574]: E1004 05:52:33.123058 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="registry-server" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.123073 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="registry-server" Oct 04 05:52:33 crc kubenswrapper[4574]: E1004 05:52:33.123097 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="extract-utilities" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.123103 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="extract-utilities" Oct 04 05:52:33 crc kubenswrapper[4574]: E1004 05:52:33.123130 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="extract-content" Oct 04 
05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.123137 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="extract-content" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.123341 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="558bcf6c-6a23-4f2f-b2e1-228f680d4ae5" containerName="registry-server" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.124798 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.131298 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2nqr"] Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.233006 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-catalog-content\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.233088 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-utilities\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.234044 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spcmx\" (UniqueName: \"kubernetes.io/projected/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-kube-api-access-spcmx\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc 
kubenswrapper[4574]: I1004 05:52:33.336356 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spcmx\" (UniqueName: \"kubernetes.io/projected/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-kube-api-access-spcmx\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.338697 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-catalog-content\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.338820 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-utilities\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.339591 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-utilities\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.339849 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-catalog-content\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.360653 4574 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spcmx\" (UniqueName: \"kubernetes.io/projected/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-kube-api-access-spcmx\") pod \"redhat-operators-q2nqr\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.460786 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:33 crc kubenswrapper[4574]: I1004 05:52:33.830839 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2nqr"] Oct 04 05:52:34 crc kubenswrapper[4574]: I1004 05:52:34.152061 4574 generic.go:334] "Generic (PLEG): container finished" podID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerID="d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1" exitCode=0 Oct 04 05:52:34 crc kubenswrapper[4574]: I1004 05:52:34.152417 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2nqr" event={"ID":"35a6aae3-bd6f-4715-ac57-ae7fcc2624db","Type":"ContainerDied","Data":"d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1"} Oct 04 05:52:34 crc kubenswrapper[4574]: I1004 05:52:34.153515 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2nqr" event={"ID":"35a6aae3-bd6f-4715-ac57-ae7fcc2624db","Type":"ContainerStarted","Data":"cee8df0caa81fab9bddba4c6d781c08bf674499252f29851c0142993e709b37b"} Oct 04 05:52:36 crc kubenswrapper[4574]: I1004 05:52:36.186117 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2nqr" event={"ID":"35a6aae3-bd6f-4715-ac57-ae7fcc2624db","Type":"ContainerStarted","Data":"6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d"} Oct 04 05:52:38 crc kubenswrapper[4574]: I1004 05:52:38.205479 4574 generic.go:334] "Generic 
(PLEG): container finished" podID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerID="6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d" exitCode=0 Oct 04 05:52:38 crc kubenswrapper[4574]: I1004 05:52:38.205996 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2nqr" event={"ID":"35a6aae3-bd6f-4715-ac57-ae7fcc2624db","Type":"ContainerDied","Data":"6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d"} Oct 04 05:52:39 crc kubenswrapper[4574]: I1004 05:52:39.217640 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2nqr" event={"ID":"35a6aae3-bd6f-4715-ac57-ae7fcc2624db","Type":"ContainerStarted","Data":"8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c"} Oct 04 05:52:39 crc kubenswrapper[4574]: I1004 05:52:39.245164 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q2nqr" podStartSLOduration=1.59339385 podStartE2EDuration="6.245142081s" podCreationTimestamp="2025-10-04 05:52:33 +0000 UTC" firstStartedPulling="2025-10-04 05:52:34.154308281 +0000 UTC m=+3980.008451323" lastFinishedPulling="2025-10-04 05:52:38.806056512 +0000 UTC m=+3984.660199554" observedRunningTime="2025-10-04 05:52:39.238930203 +0000 UTC m=+3985.093073245" watchObservedRunningTime="2025-10-04 05:52:39.245142081 +0000 UTC m=+3985.099285123" Oct 04 05:52:43 crc kubenswrapper[4574]: I1004 05:52:43.461666 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:43 crc kubenswrapper[4574]: I1004 05:52:43.463733 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:43 crc kubenswrapper[4574]: I1004 05:52:43.510796 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 
04 05:52:44 crc kubenswrapper[4574]: I1004 05:52:44.322378 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:44 crc kubenswrapper[4574]: I1004 05:52:44.367676 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2nqr"] Oct 04 05:52:46 crc kubenswrapper[4574]: I1004 05:52:46.285972 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q2nqr" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="registry-server" containerID="cri-o://8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c" gracePeriod=2 Oct 04 05:52:46 crc kubenswrapper[4574]: I1004 05:52:46.756618 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:46 crc kubenswrapper[4574]: I1004 05:52:46.914549 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-catalog-content\") pod \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " Oct 04 05:52:46 crc kubenswrapper[4574]: I1004 05:52:46.914832 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spcmx\" (UniqueName: \"kubernetes.io/projected/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-kube-api-access-spcmx\") pod \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " Oct 04 05:52:46 crc kubenswrapper[4574]: I1004 05:52:46.914887 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-utilities\") pod \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\" (UID: \"35a6aae3-bd6f-4715-ac57-ae7fcc2624db\") " Oct 04 
05:52:46 crc kubenswrapper[4574]: I1004 05:52:46.916192 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-utilities" (OuterVolumeSpecName: "utilities") pod "35a6aae3-bd6f-4715-ac57-ae7fcc2624db" (UID: "35a6aae3-bd6f-4715-ac57-ae7fcc2624db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:52:46 crc kubenswrapper[4574]: I1004 05:52:46.925001 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-kube-api-access-spcmx" (OuterVolumeSpecName: "kube-api-access-spcmx") pod "35a6aae3-bd6f-4715-ac57-ae7fcc2624db" (UID: "35a6aae3-bd6f-4715-ac57-ae7fcc2624db"). InnerVolumeSpecName "kube-api-access-spcmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.012284 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35a6aae3-bd6f-4715-ac57-ae7fcc2624db" (UID: "35a6aae3-bd6f-4715-ac57-ae7fcc2624db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.017879 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spcmx\" (UniqueName: \"kubernetes.io/projected/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-kube-api-access-spcmx\") on node \"crc\" DevicePath \"\"" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.017912 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-utilities\") on node \"crc\" DevicePath \"\"" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.017922 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a6aae3-bd6f-4715-ac57-ae7fcc2624db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.296689 4574 generic.go:334] "Generic (PLEG): container finished" podID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerID="8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c" exitCode=0 Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.296744 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2nqr" event={"ID":"35a6aae3-bd6f-4715-ac57-ae7fcc2624db","Type":"ContainerDied","Data":"8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c"} Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.296779 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2nqr" event={"ID":"35a6aae3-bd6f-4715-ac57-ae7fcc2624db","Type":"ContainerDied","Data":"cee8df0caa81fab9bddba4c6d781c08bf674499252f29851c0142993e709b37b"} Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.296798 4574 scope.go:117] "RemoveContainer" containerID="8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.296979 
4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2nqr" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.335735 4574 scope.go:117] "RemoveContainer" containerID="6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.342892 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2nqr"] Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.351934 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q2nqr"] Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.357599 4574 scope.go:117] "RemoveContainer" containerID="d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.407711 4574 scope.go:117] "RemoveContainer" containerID="8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c" Oct 04 05:52:47 crc kubenswrapper[4574]: E1004 05:52:47.408196 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c\": container with ID starting with 8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c not found: ID does not exist" containerID="8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.408254 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c"} err="failed to get container status \"8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c\": rpc error: code = NotFound desc = could not find container \"8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c\": container with ID starting with 
8263e79f7e8dd10091d4a53c4e2c1d7ebcced94e45f3b9c15a3ff0a635260c2c not found: ID does not exist" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.408284 4574 scope.go:117] "RemoveContainer" containerID="6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d" Oct 04 05:52:47 crc kubenswrapper[4574]: E1004 05:52:47.408877 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d\": container with ID starting with 6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d not found: ID does not exist" containerID="6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.408901 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d"} err="failed to get container status \"6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d\": rpc error: code = NotFound desc = could not find container \"6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d\": container with ID starting with 6f0fd132bc6c065978519df322e1fea61ddda7979d37f94070336284a41a097d not found: ID does not exist" Oct 04 05:52:47 crc kubenswrapper[4574]: I1004 05:52:47.408914 4574 scope.go:117] "RemoveContainer" containerID="d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1" Oct 04 05:52:47 crc kubenswrapper[4574]: E1004 05:52:47.409565 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1\": container with ID starting with d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1 not found: ID does not exist" containerID="d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1" Oct 04 05:52:47 crc 
kubenswrapper[4574]: I1004 05:52:47.409596 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1"} err="failed to get container status \"d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1\": rpc error: code = NotFound desc = could not find container \"d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1\": container with ID starting with d44a5189b3f3817e2a84e2cde549c32c5a49ffede185f496e5f40aefb071b4a1 not found: ID does not exist" Oct 04 05:52:48 crc kubenswrapper[4574]: I1004 05:52:48.752309 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" path="/var/lib/kubelet/pods/35a6aae3-bd6f-4715-ac57-ae7fcc2624db/volumes" Oct 04 05:53:22 crc kubenswrapper[4574]: I1004 05:53:22.635482 4574 generic.go:334] "Generic (PLEG): container finished" podID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerID="fb4f67df1549d103da45f2f05b466be2878af655b2fe5fc6ab7dd05bed02e051" exitCode=0 Oct 04 05:53:22 crc kubenswrapper[4574]: I1004 05:53:22.635624 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmxsr/must-gather-6kx54" event={"ID":"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65","Type":"ContainerDied","Data":"fb4f67df1549d103da45f2f05b466be2878af655b2fe5fc6ab7dd05bed02e051"} Oct 04 05:53:22 crc kubenswrapper[4574]: I1004 05:53:22.636888 4574 scope.go:117] "RemoveContainer" containerID="fb4f67df1549d103da45f2f05b466be2878af655b2fe5fc6ab7dd05bed02e051" Oct 04 05:53:23 crc kubenswrapper[4574]: I1004 05:53:23.247880 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmxsr_must-gather-6kx54_1d1bef80-285c-4f5d-9ea8-f46ed07e3d65/gather/0.log" Oct 04 05:53:25 crc kubenswrapper[4574]: I1004 05:53:25.499318 4574 scope.go:117] "RemoveContainer" containerID="0bd08e80c06896a729fb91f910c9174a8ddea1243dc924a34ec2605a3a873329" Oct 04 05:53:35 crc 
kubenswrapper[4574]: I1004 05:53:35.434454 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmxsr/must-gather-6kx54"] Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.435592 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xmxsr/must-gather-6kx54" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerName="copy" containerID="cri-o://3ac7d10566717c18113c4e4215038853490f021e92ce89082b4652f5ddbfbd21" gracePeriod=2 Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.444692 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmxsr/must-gather-6kx54"] Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.777331 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmxsr_must-gather-6kx54_1d1bef80-285c-4f5d-9ea8-f46ed07e3d65/copy/0.log" Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.778123 4574 generic.go:334] "Generic (PLEG): container finished" podID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerID="3ac7d10566717c18113c4e4215038853490f021e92ce89082b4652f5ddbfbd21" exitCode=143 Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.910484 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmxsr_must-gather-6kx54_1d1bef80-285c-4f5d-9ea8-f46ed07e3d65/copy/0.log" Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.910955 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.959154 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-must-gather-output\") pod \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.959363 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj4kl\" (UniqueName: \"kubernetes.io/projected/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-kube-api-access-gj4kl\") pod \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\" (UID: \"1d1bef80-285c-4f5d-9ea8-f46ed07e3d65\") " Oct 04 05:53:35 crc kubenswrapper[4574]: I1004 05:53:35.978685 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-kube-api-access-gj4kl" (OuterVolumeSpecName: "kube-api-access-gj4kl") pod "1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" (UID: "1d1bef80-285c-4f5d-9ea8-f46ed07e3d65"). InnerVolumeSpecName "kube-api-access-gj4kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.061977 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj4kl\" (UniqueName: \"kubernetes.io/projected/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-kube-api-access-gj4kl\") on node \"crc\" DevicePath \"\"" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.145480 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" (UID: "1d1bef80-285c-4f5d-9ea8-f46ed07e3d65"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.167885 4574 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.743518 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" path="/var/lib/kubelet/pods/1d1bef80-285c-4f5d-9ea8-f46ed07e3d65/volumes" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.787085 4574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmxsr_must-gather-6kx54_1d1bef80-285c-4f5d-9ea8-f46ed07e3d65/copy/0.log" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.787428 4574 scope.go:117] "RemoveContainer" containerID="3ac7d10566717c18113c4e4215038853490f021e92ce89082b4652f5ddbfbd21" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.787551 4574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmxsr/must-gather-6kx54" Oct 04 05:53:36 crc kubenswrapper[4574]: I1004 05:53:36.817140 4574 scope.go:117] "RemoveContainer" containerID="fb4f67df1549d103da45f2f05b466be2878af655b2fe5fc6ab7dd05bed02e051" Oct 04 05:53:49 crc kubenswrapper[4574]: I1004 05:53:49.404907 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:53:49 crc kubenswrapper[4574]: I1004 05:53:49.405412 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:54:19 crc kubenswrapper[4574]: I1004 05:54:19.404845 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 04 05:54:19 crc kubenswrapper[4574]: I1004 05:54:19.405616 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.570518 4574 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c9k7d"] Oct 04 05:54:34 crc kubenswrapper[4574]: E1004 05:54:34.571545 4574 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerName="copy" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571565 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerName="copy" Oct 04 05:54:34 crc kubenswrapper[4574]: E1004 05:54:34.571608 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerName="gather" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571618 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerName="gather" Oct 04 05:54:34 crc kubenswrapper[4574]: E1004 05:54:34.571631 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="extract-content" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571639 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="extract-content" Oct 04 05:54:34 crc kubenswrapper[4574]: E1004 05:54:34.571661 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="extract-utilities" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571670 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="extract-utilities" Oct 04 05:54:34 crc kubenswrapper[4574]: E1004 05:54:34.571682 4574 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="registry-server" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571691 4574 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="registry-server" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571942 4574 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerName="copy" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571959 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a6aae3-bd6f-4715-ac57-ae7fcc2624db" containerName="registry-server" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.571980 4574 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1bef80-285c-4f5d-9ea8-f46ed07e3d65" containerName="gather" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.573700 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.582536 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9k7d"] Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.651377 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-catalog-content\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.651850 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-utilities\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.652029 4574 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtmd\" (UniqueName: \"kubernetes.io/projected/a2388ba2-7b41-4387-9838-b571bc31f41a-kube-api-access-rbtmd\") pod \"redhat-marketplace-c9k7d\" (UID: 
\"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.754313 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-catalog-content\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.754650 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-utilities\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.754765 4574 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtmd\" (UniqueName: \"kubernetes.io/projected/a2388ba2-7b41-4387-9838-b571bc31f41a-kube-api-access-rbtmd\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.756374 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-utilities\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d" Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.756478 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-catalog-content\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " 
pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.778462 4574 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtmd\" (UniqueName: \"kubernetes.io/projected/a2388ba2-7b41-4387-9838-b571bc31f41a-kube-api-access-rbtmd\") pod \"redhat-marketplace-c9k7d\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") " pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:34 crc kubenswrapper[4574]: I1004 05:54:34.892934 4574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:35 crc kubenswrapper[4574]: I1004 05:54:35.396834 4574 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9k7d"]
Oct 04 05:54:36 crc kubenswrapper[4574]: I1004 05:54:36.260490 4574 generic.go:334] "Generic (PLEG): container finished" podID="a2388ba2-7b41-4387-9838-b571bc31f41a" containerID="f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18" exitCode=0
Oct 04 05:54:36 crc kubenswrapper[4574]: I1004 05:54:36.260600 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9k7d" event={"ID":"a2388ba2-7b41-4387-9838-b571bc31f41a","Type":"ContainerDied","Data":"f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18"}
Oct 04 05:54:36 crc kubenswrapper[4574]: I1004 05:54:36.260841 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9k7d" event={"ID":"a2388ba2-7b41-4387-9838-b571bc31f41a","Type":"ContainerStarted","Data":"22abec7c2103217a1f866dd8843ef7f354ec09bedef7f62ca9424aaf442d7603"}
Oct 04 05:54:37 crc kubenswrapper[4574]: I1004 05:54:37.273349 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9k7d" event={"ID":"a2388ba2-7b41-4387-9838-b571bc31f41a","Type":"ContainerStarted","Data":"9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88"}
Oct 04 05:54:38 crc kubenswrapper[4574]: I1004 05:54:38.283493 4574 generic.go:334] "Generic (PLEG): container finished" podID="a2388ba2-7b41-4387-9838-b571bc31f41a" containerID="9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88" exitCode=0
Oct 04 05:54:38 crc kubenswrapper[4574]: I1004 05:54:38.283532 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9k7d" event={"ID":"a2388ba2-7b41-4387-9838-b571bc31f41a","Type":"ContainerDied","Data":"9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88"}
Oct 04 05:54:39 crc kubenswrapper[4574]: I1004 05:54:39.294352 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9k7d" event={"ID":"a2388ba2-7b41-4387-9838-b571bc31f41a","Type":"ContainerStarted","Data":"1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99"}
Oct 04 05:54:39 crc kubenswrapper[4574]: I1004 05:54:39.321611 4574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c9k7d" podStartSLOduration=2.853414091 podStartE2EDuration="5.321593008s" podCreationTimestamp="2025-10-04 05:54:34 +0000 UTC" firstStartedPulling="2025-10-04 05:54:36.265386095 +0000 UTC m=+4102.119529137" lastFinishedPulling="2025-10-04 05:54:38.733565012 +0000 UTC m=+4104.587708054" observedRunningTime="2025-10-04 05:54:39.310138229 +0000 UTC m=+4105.164281271" watchObservedRunningTime="2025-10-04 05:54:39.321593008 +0000 UTC m=+4105.175736050"
Oct 04 05:54:44 crc kubenswrapper[4574]: I1004 05:54:44.893224 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:44 crc kubenswrapper[4574]: I1004 05:54:44.893868 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:44 crc kubenswrapper[4574]: I1004 05:54:44.937657 4574 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:45 crc kubenswrapper[4574]: I1004 05:54:45.398763 4574 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:45 crc kubenswrapper[4574]: I1004 05:54:45.443177 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9k7d"]
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.372659 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c9k7d" podUID="a2388ba2-7b41-4387-9838-b571bc31f41a" containerName="registry-server" containerID="cri-o://1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99" gracePeriod=2
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.814010 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.925272 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbtmd\" (UniqueName: \"kubernetes.io/projected/a2388ba2-7b41-4387-9838-b571bc31f41a-kube-api-access-rbtmd\") pod \"a2388ba2-7b41-4387-9838-b571bc31f41a\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") "
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.925449 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-catalog-content\") pod \"a2388ba2-7b41-4387-9838-b571bc31f41a\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") "
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.925494 4574 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-utilities\") pod \"a2388ba2-7b41-4387-9838-b571bc31f41a\" (UID: \"a2388ba2-7b41-4387-9838-b571bc31f41a\") "
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.926703 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-utilities" (OuterVolumeSpecName: "utilities") pod "a2388ba2-7b41-4387-9838-b571bc31f41a" (UID: "a2388ba2-7b41-4387-9838-b571bc31f41a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.932401 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2388ba2-7b41-4387-9838-b571bc31f41a-kube-api-access-rbtmd" (OuterVolumeSpecName: "kube-api-access-rbtmd") pod "a2388ba2-7b41-4387-9838-b571bc31f41a" (UID: "a2388ba2-7b41-4387-9838-b571bc31f41a"). InnerVolumeSpecName "kube-api-access-rbtmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 04 05:54:47 crc kubenswrapper[4574]: I1004 05:54:47.942673 4574 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2388ba2-7b41-4387-9838-b571bc31f41a" (UID: "a2388ba2-7b41-4387-9838-b571bc31f41a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.028017 4574 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.028064 4574 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2388ba2-7b41-4387-9838-b571bc31f41a-utilities\") on node \"crc\" DevicePath \"\""
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.028078 4574 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbtmd\" (UniqueName: \"kubernetes.io/projected/a2388ba2-7b41-4387-9838-b571bc31f41a-kube-api-access-rbtmd\") on node \"crc\" DevicePath \"\""
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.384501 4574 generic.go:334] "Generic (PLEG): container finished" podID="a2388ba2-7b41-4387-9838-b571bc31f41a" containerID="1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99" exitCode=0
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.384567 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9k7d" event={"ID":"a2388ba2-7b41-4387-9838-b571bc31f41a","Type":"ContainerDied","Data":"1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99"}
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.384802 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9k7d" event={"ID":"a2388ba2-7b41-4387-9838-b571bc31f41a","Type":"ContainerDied","Data":"22abec7c2103217a1f866dd8843ef7f354ec09bedef7f62ca9424aaf442d7603"}
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.384826 4574 scope.go:117] "RemoveContainer" containerID="1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.384617 4574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9k7d"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.404016 4574 scope.go:117] "RemoveContainer" containerID="9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.424263 4574 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9k7d"]
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.426737 4574 scope.go:117] "RemoveContainer" containerID="f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.443559 4574 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9k7d"]
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.465119 4574 scope.go:117] "RemoveContainer" containerID="1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99"
Oct 04 05:54:48 crc kubenswrapper[4574]: E1004 05:54:48.465629 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99\": container with ID starting with 1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99 not found: ID does not exist" containerID="1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.465817 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99"} err="failed to get container status \"1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99\": rpc error: code = NotFound desc = could not find container \"1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99\": container with ID starting with 1af7d526975b7981d89a0cd06070134faba6ac4179d9cbea32a0d7b0c164aa99 not found: ID does not exist"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.465941 4574 scope.go:117] "RemoveContainer" containerID="9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88"
Oct 04 05:54:48 crc kubenswrapper[4574]: E1004 05:54:48.466513 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88\": container with ID starting with 9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88 not found: ID does not exist" containerID="9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.466567 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88"} err="failed to get container status \"9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88\": rpc error: code = NotFound desc = could not find container \"9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88\": container with ID starting with 9e7e800ab09967a9375ee9acb9b8f948eb91f8c4a46d7e56734932471c0b3a88 not found: ID does not exist"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.466609 4574 scope.go:117] "RemoveContainer" containerID="f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18"
Oct 04 05:54:48 crc kubenswrapper[4574]: E1004 05:54:48.467081 4574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18\": container with ID starting with f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18 not found: ID does not exist" containerID="f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.467279 4574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18"} err="failed to get container status \"f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18\": rpc error: code = NotFound desc = could not find container \"f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18\": container with ID starting with f4b60535d8a502b5c3dd8edd069c17bb5759f5a4df86e4b184f00167f57ebb18 not found: ID does not exist"
Oct 04 05:54:48 crc kubenswrapper[4574]: I1004 05:54:48.743865 4574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2388ba2-7b41-4387-9838-b571bc31f41a" path="/var/lib/kubelet/pods/a2388ba2-7b41-4387-9838-b571bc31f41a/volumes"
Oct 04 05:54:49 crc kubenswrapper[4574]: I1004 05:54:49.405284 4574 patch_prober.go:28] interesting pod/machine-config-daemon-wl5xt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 04 05:54:49 crc kubenswrapper[4574]: I1004 05:54:49.406126 4574 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 04 05:54:49 crc kubenswrapper[4574]: I1004 05:54:49.406306 4574 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt"
Oct 04 05:54:49 crc kubenswrapper[4574]: I1004 05:54:49.407046 4574 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69e8fcd7aa35e9b0ed87acc47978ec5c3c20ef3794b3a68222e8dbdb4bd47f56"} pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 04 05:54:49 crc kubenswrapper[4574]: I1004 05:54:49.407273 4574 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" podUID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerName="machine-config-daemon" containerID="cri-o://69e8fcd7aa35e9b0ed87acc47978ec5c3c20ef3794b3a68222e8dbdb4bd47f56" gracePeriod=600
Oct 04 05:54:50 crc kubenswrapper[4574]: I1004 05:54:50.406520 4574 generic.go:334] "Generic (PLEG): container finished" podID="75910bdc-1940-4d15-b390-4bcfcec9f72c" containerID="69e8fcd7aa35e9b0ed87acc47978ec5c3c20ef3794b3a68222e8dbdb4bd47f56" exitCode=0
Oct 04 05:54:50 crc kubenswrapper[4574]: I1004 05:54:50.406605 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerDied","Data":"69e8fcd7aa35e9b0ed87acc47978ec5c3c20ef3794b3a68222e8dbdb4bd47f56"}
Oct 04 05:54:50 crc kubenswrapper[4574]: I1004 05:54:50.407977 4574 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wl5xt" event={"ID":"75910bdc-1940-4d15-b390-4bcfcec9f72c","Type":"ContainerStarted","Data":"aa3106991879cb88e6d9c4071901a7e7e9c1df2bf969f4f88579b7b200a86d48"}
Oct 04 05:54:50 crc kubenswrapper[4574]: I1004 05:54:50.408095 4574 scope.go:117] "RemoveContainer" containerID="f2f04f69cc526e14b782e88035973ea57fef284ae204886174ed7e3576637a41"